Monday 21 November 2016

* Echo chamber: You are what you read

https://medium.com/
Tobias Rose-Stockwell

The thing that has become most clear to us this election year is that we don’t agree on the fundamental truths we thought we did.

I went to college in the part of Pennsylvania that helped flip the state for Trump. A good number of my friends still live there, and have posted messages from what seems, at this moment in history, to be a completely different country.

Over the last several weeks I have watched dozens of my friends on Facebook de-friend one another. I have seen plenty of self-righteous posts flow across my news feed, along with deeply felt messages of fear, anger and more recently — existential despair.

On the other side I see reflections of joy, levity, gratitude and optimism for the future. It could not be more stark.

The thing that both groups have in common is very apparent: a sense of profound confusion that the other side cannot understand their perspective.

This seemed to be building on a trend in social media that hit full tilt in the lead-up to the election: political divisions between us are greater than they have ever been, and are still getting worse by the day.

I don’t believe that the Media Elite, Donald Trump or the Alt Right are to blame for the state of our politics. They peddle influence and ideas, but they don’t change the actual makeup of our country. Elected officials are still a fairly accurate representation of voters’ wishes.

I also don’t believe this is inherently a reaction to the political overreach of the status quo. This discontent is part of something felt outside of our borders too. You do not have to look far to see this rising tide of hyper-nationalism going international.

The reason is much more subversive, and something we really haven’t been able to address as humans until now. I believe that the way we consume information has literally changed the kind of people we are.

How did we get here?

For much of the 20th century and into the 21st, we had a very small handful of channels through which to consume things like the news. (Advance warning: for the sake of brevity, I’m going to gloss over a lot.)

We had the big 3 TV networks, and a number of regional papers and radio stations that pumped out the majority of what we watched, read and listened to.

When politicians did things wrong, journalists competed to ask questions and report on it — and scoop each other on The Facts. When a claim of biased reporting was leveled, it was considered a pretty big insult.

Having such a small handful of news sources also had its drawbacks — it was basically a monopoly, which left little room for opinions that deviated from the mainstream.

This media pipeline was so important in politics that a law was passed in 1927 called the Equal Time rule, which required broadcasters who gave prime-time radio or TV airtime to one political candidate to offer equivalent airtime to opposing candidates. Remember that.

The Invention of the Private Personal Pipeline

When the internet came along, it was heralded as a new way to democratize this traditional monopoly on The Facts. People generally thought this was a great thing, and a way to expose us to a diverse new range of opinions.

About a decade ago, a few new startups began giving us reasons to consume media by being online all the time. The ones that we know really well are Facebook and Twitter, but we’re mostly going to talk about Facebook here. It went from zero to a billion users in roughly eight years, and has essentially changed humanity’s relationship with the internet.
The most significant thing they built was your personal pipeline — the News Feed. It quickly changed from a fairly simple way to read posts from your friends to one based on a much more complicated algorithm that optimized for ‘engagement.’
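
To make that shift concrete, here is a minimal sketch, in Python, of what an engagement-optimized ranker looks like in principle. Every field name and weight below is a hypothetical illustration; Facebook’s actual ranking system is proprietary and vastly more complex.

```python
# Minimal sketch of an engagement-optimized feed ranker.
# All fields and weights are hypothetical illustrations,
# not Facebook's actual (proprietary) system.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int
    shares: int
    hours_old: float

def engagement_score(post: Post) -> float:
    """Predict 'engagement': active responses weigh more, and old posts decay."""
    raw = post.likes + 4 * post.comments + 8 * post.shares
    return raw / (1.0 + post.hours_old)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Sort the feed purely by engagement. Note what is absent: nothing
    here rewards accuracy, balance, or viewpoint diversity."""
    return sorted(posts, key=engagement_score, reverse=True)
```

The point of the sketch is the objective function: whatever maximizes clicks and reactions rises to the top, and nothing else is measured.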

As you know already, Facebook got really good at this. Their sorting algorithm became the primary method to serve us every type of content. It blew past Twitter and every other media channel (and is likely how you’re reading this article now).

Very suddenly, people realized this feed was way more important than the Big 3, newspapers, or radio ever were. A lot of people stopped buying papers or watching the news on TV. Everyone began to piggyback on this algorithm because it did such a good job of keeping people’s eyeballs online and happy.

But those eyeballs stopped caring as much about the big brand name news sites, because there were plenty of little news sites for us to read. Many of those sites had a more squishy relationship with journalism.
And as long as what the articles said made us feel pretty good and looked the same as traditional news, we kept reading them. For the first time, we suddenly had plenty of choice on The Facts.

You Are What You Read

If you are an average American with access to the internet, you consume a big portion of your news through social media — 62% of us get news this way. Facebook’s news feed is now the primary driver of traffic to news sites.

Most of the events that you read about will come through this feed. Most of your opinions will be shaped by it. This is a stream of information that is curated and limited to the things that will not make you uncomfortable — and it certainly will not provide equal airtime to opposing viewpoints.

This news feed is a bubble, and the things that filter through are the things that do not challenge you. This is a version of what internet activist Eli Pariser called the Filter Bubble.

The Wall Street Journal recently built a tool that illustrates just how radically this has allowed us to self-select the bubbles of our facts. Red Feed Blue Feed creates two custom news feeds on the exact same topic (say, Michelle Obama), one from conservative and one from liberal news sites on Facebook, and displays them side by side. It shows how easily one can become insulated inside a stream of news that confirms our assumptions and suspicions about the world, just by algorithmically tailoring the people and pages we follow.

We Prefer Information Ghettos

There is a funny quirk in our nature that psychologists call Confirmation Bias. It’s a real thing, and you can see people fall into it all the time. It is the natural human tendency to interpret new information as confirming our existing beliefs or theories. When we have a choice to read news that confirms our worldview or challenges it — we almost always choose the former, regardless of the evidence.

Since we feel uncomfortable when we’re exposed to media that pushes back on our perspective (like that weird political uncle you see at a family reunion), we usually end up avoiding it. It requires a lot of effort to change opinions, and generally it feels gross to have difficult chats with people that don’t agree with us. So, we politely decline the opportunity to become their friend, buy their product, read their magazine, or watch their show.

We insulate ourselves in these ‘information ghettos’ not because we mean to, but because it’s just easier.

Our own Facebook feed is no different. It is a manifestation of who we are. It was created by us: by the things we have liked in the past, by the friends we have added along the way, and by people that tend to have opinions a lot like ours. It is made by us.

This is self-segregation, and it happens naturally. But the success of Facebook’s algorithm has effectively poured gasoline on this smoldering innate bias.

The Problem with Community

But what about community? Facebook (and the internet in general) has done an amazing job at helping people find community. It has given us a way to connect with our best-matching, most specific, perfectly fitting counterparts online. From Furby collectors to exotic mushroom cultivators to the Alt Right, there is a place for everyone.

But there is a flaw in how we see community. As humans, we evolved in small tribes and rarely saw really large groups of other people. Because of this, we are bad at instinctively understanding the difference between ‘big’ numbers and ‘huge’ numbers. In any kind of physical setting, the difference between many thousands of people and many millions of people is effectively impossible for us to perceive.

Online this has allowed us to insulate ourselves entirely within groups that may be a tiny fraction of our nation, without ever seeing another side. We instinctively feel like this is representative of a majority.

These online communities — to us — might seem to be purveyors of truth that embody The Facts better than anywhere else. They also might feel enormous — thousands of people might agree with you. But that doesn’t make their views true, or a majority opinion.

Contact Increases Empathy, Insulation Kills It

In social psychology there is a framework called the Contact Hypothesis, which has shown that prejudice is reduced through extended contact with people who have different backgrounds, opinions and cultures than our own. Developed by psychologist Gordon Allport as a way to understand discrimination, it is widely seen as one of the most successful tools for reducing prejudice and increasing empathy. It is a measurable and time-tested way of helping people get along.

The more time you spend with others that are different from you in an environment that is mutually beneficial, the more you will understand them. The more you understand them, the less prejudice and implicit bias you will have.

This contact is exactly what is missing from our social channels, which are designed to let us insulate ourselves from the people and opinions we would prefer not to see.

We must agree on The Facts in order to co-exist

Facebook has stated that their mission is to make the world a more open and connected place. And they have, by anyone’s measure, connected more humans than any company in history.

With this success, they have also created a tool that has allowed us to become more insulated in our own ideological bubbles than we ever have been before.

Because of this lack of pluralism, we are systematically losing our ability to empathize. This is what we now see in the wider world — from Brexit to Trump to hyper-nationalistic movements worldwide. People globally no longer have the same incentives to find a shared understanding. This is not just dissatisfaction with globalization or the status quo. This is how we are changing our society by not seeing each other.

The precursor to building walls around nations is building walls around ideas

A reasonable understanding of The Facts is necessary for a concept we don’t really think about very much these days: Compromise. Compromise is what leads to consensus, and consensus is what allows for democracy.

It is not always joyous or exuberant. It doesn’t always feel good to require ourselves to care about other people’s opinions, needs and desires — especially when they don’t agree with our own. But this is what democracy is: a decision to live within a shared idea of the future. A mutual attempt at the hard civility of real compromise in order to keep moving forward together.

We need a moment for catharsis. To breathe. To cry. To be relieved, or to be angry.

But we also need to remember this: if we cannot build the tools of our media to encourage empathy and consensus, we will retreat further into the toxic divisions that have come to define us today.

That careful consensus is the foundation upon which democracy is created — a sober understanding that allows for us to act as one whole. An attempt to find mutuality in our imperfections and differences, with the trust that we are together more extraordinary than our individual parts.

How we can do this better:

Ways to increase your political empathy online

  • Expose yourself to alternative opinions — Read the other side: Your news sources likely have their own bias baked right in. There is no better way of unpacking your own beliefs than exposing yourself to the news sites that disagree with you.
  • Examine the source of news for bias and factual inaccuracy before you share it — Cultivate a healthy skepticism when you see an exciting headline that comes from a website you haven’t heard of. Many of these posts are designed to appeal to hyper-partisanship in order to get you to share them.
  • Engage with people who are different from you when you can — Don’t delete the friends on Facebook who disagree with you (trolls excepted). You will not ‘pollute’ your worldview by talking to them and trying to understand their perspective. Make the extra effort to engage in civil discourse, build common ground and avoid a shouting match.

What Facebook can do:

(Warning — this gets nerdy)

Facebook should do more to prioritize posts that come from verified sources. It should functionally de-prioritize/flag sites that peddle fake news (easy to implement) and even hyper-partisan news from both sides (harder). This editorial process should be neutral. A news feed that is optimized for engagement is essentially the algorithmic equivalent of “if it bleeds, it leads” — this is problematic when journalistic due process is missing from a huge portion of web-based news.
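
As a rough sketch of this first idea, assume a neutral, vetted process has already assigned each source a credibility tier; the tier names and multipliers below are hypothetical.

```python
# Hedged sketch: scale a post's ranking score by its source's credibility
# tier. Tier names and multipliers are hypothetical illustrations; the hard
# part in practice is assigning the labels neutrally, not applying them.
CREDIBILITY_WEIGHT = {
    "verified": 1.0,     # established, fact-checked outlets keep full weight
    "unverified": 0.5,   # unknown sources are down-weighted
    "fake": 0.05,        # known fabricators are effectively buried and flagged
}

def adjusted_score(engagement_score: float, source_tier: str) -> float:
    """Apply the credibility multiplier to a base engagement score."""
    return engagement_score * CREDIBILITY_WEIGHT.get(source_tier, 0.5)
```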

Consider Equal Air Time (or Equal Attention). Facebook knows exactly how long you spend consuming the media you do on their platform. They also know how partisan you are (or are likely to be), how old you are, and the kind of media you like. If the content you consume is exclusively partisan (as determined on a per-source basis above), Facebook should rank this transparently, and allow space for sources with opposing political views to enter your feed (demographically, your pool of “friends of friends” can cross into a large range of political perspectives).
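
One way to picture the Equal Attention idea, assuming (hypothetically) that each source carries a left/right tag and that reading time is logged per tag:

```python
# Illustrative sketch of "Equal Attention": measure how one-sided a user's
# reading time is, then reserve feed slots for the other side once a
# threshold is crossed. All names and thresholds are hypothetical.
def partisan_skew(seconds_by_lean: dict[str, float]) -> float:
    """Return a skew in [-1, 1]: -1 is all left-leaning time, +1 all right."""
    left = seconds_by_lean.get("left", 0.0)
    right = seconds_by_lean.get("right", 0.0)
    total = left + right
    return 0.0 if total == 0 else (right - left) / total

def opposing_view_slots(skew: float, feed_size: int,
                        threshold: float = 0.6) -> int:
    """Once consumption is heavily one-sided, reserve roughly a fifth of
    the feed for well-sourced posts from the opposing side."""
    if abs(skew) < threshold:
        return 0
    return max(1, feed_size // 5)
```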

Ensure exposure to median viewpoints, not just opposing ones. The companies that dictate our diet of information must have fail-safes to keep us from isolating ourselves completely inside fully partisan ‘information ghettos’. Using the demographic information Facebook has about us, it can determine just how limited our exposure to alternative viewpoints is, and improve our access to posts outside our immediate social graph. This does require a tagging mechanism for the assumed partisanship of various news sources and articles, but it is doable.
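
The median-viewpoint idea can be sketched the same way, again assuming a hypothetical partisanship tag per source on a -1 (left) to +1 (right) scale:

```python
# Sketch of a fail-safe for median viewpoints. The tag scale and cutoff are
# hypothetical; the point is that the remedy is centrist content, not only
# hard-opposing content.
def feed_partisanship(tags: list[float]) -> float:
    """Average partisanship of the sources a user actually sees."""
    return sum(tags) / len(tags) if tags else 0.0

def needs_median_content(tags: list[float], cutoff: float = 0.5) -> bool:
    """True when the feed has drifted far from center, signalling the
    ranker to mix in sources tagged near 0 rather than only the far
    opposite pole."""
    return abs(feed_partisanship(tags)) > cutoff
```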

Finally, Facebook should be more open with how its algorithm editorializes the types of content we see. Being transparent about this methodology will reduce any claims of partisanship and bias. A solution to this, and all of these problems, can be found without compromising its IP.

Facebook’s success has turned it into one of the most powerful tools we have for connecting to other humans. Something this powerful, which has come to command so much of our attention, also deserves our scrutiny in equal measure.
