
Election Night: Is there an echo?

Election night, Australia 2013.

Instead of switching on the telly when I got home (post the Brisbane Writers' Festival Great Debate, which was much fun...), I flipped my laptop screen open and basked in the cool light of my Facebook newsfeed.

Oh the woe.

But then the thought: if almost everyone I know is upset by the election results, judging by my newsfeed, who on earth voted for Tony Abbott?

(Correction. Who voted for the LNP? Such presidential-style language. A political party is a party is a party, not a person.)

Curious. I think I saw only one congratulatory status update. It could have been because all the LNP supporters were hitting the town in celebration and excitement, but the following few days saw little change in the tone of my feed (with occasional bursts about Syria).

It reminded me of Eli Pariser's book from a few years back, "The Filter Bubble: What the Internet Is Hiding from You", which talks about the Google and Facebook algorithms that 'personalise' what you see in search results and newsfeeds.

The synopsis paints a dystopian picture.

[box] Though the phenomenon has gone largely undetected until now, personalized filters are sweeping the Web, creating individual universes of information for each of us. Facebook - the primary news source for an increasing number of Americans - prioritizes the links it believes will appeal to you so that if you are a liberal, you can expect to see only progressive links. Even an old-media bastion like The Washington Post devotes the top of its home page to a news feed with the links your Facebook friends are sharing. Behind the scenes a burgeoning industry of data companies is tracking your personal information to sell to advertisers, from your political leanings to the color you painted your living room to the hiking boots you just browsed on Zappos. In a personalized world, we will increasingly be typed and fed only news that is pleasant, familiar, and confirms our beliefs - and because these filters are invisible, we won't know what is being hidden from us. Our past interests will determine what we are exposed to in the future, leaving less room for the unexpected encounters that spark creativity, innovation, and the democratic exchange of ideas.[/box]

Is it really that bad? Should we be scared? Digging a little deeper, it would seem the Facebook PR machine has been doing a little work of its own to combat this image, publishing a study that disputes the claim.

But on this I am... undecided. What exactly are we arguing? That the 'online echo chamber' doesn't exist, or about whether personalisation creates such a 'chamber'?

The personalisation certainly exists, there is no question. But what effect is it having? Well...

Facebook's research concludes:

[box] Although we’re more likely to share information from our close friends, we still share stuff from our weak ties—and the links from those weak ties are the most novel links on the network. Those links from our weak ties, that is, are most likely to point to information that you would not have shared if you hadn’t seen it on Facebook. [/box]

This links to the concept of EdgeRank, Facebook's ranking algorithm for its newsfeed. As I understand it, how close you are to people in real life doesn't matter; what appears on your feed is driven by what you interact with, whether those ties are 'strong' or 'weak'. Because we do share things from our 'weak' ties, and those are likely to be things we wouldn't have accessed any other way, Facebook's argument is that the newsfeed actually diversifies what you see.
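
For the curious, here is a rough sketch of how EdgeRank is popularly described: each story is scored by summing, over its 'edges' (likes, comments, shares), the product of your affinity with the person, the weight of the interaction type, and a time decay. The weights, decay curve, and names below are my own illustrative stand-ins; the real formula is Facebook's secret.

```python
import time

# EdgeRank-style scoring, per the popular public description:
#   rank(story) = sum over edges of affinity * weight * time_decay
# All values below are hypothetical, not Facebook's real numbers.

EDGE_WEIGHTS = {"comment": 4.0, "share": 3.0, "like": 1.0}

def time_decay(age_seconds, half_life=86400.0):
    """Older interactions count for less; here they halve once a day."""
    return 0.5 ** (age_seconds / half_life)

def edgerank(edges, now=None):
    """edges: dicts like {"type": "like", "affinity": 0.8, "created": <unix time>}.
    'affinity' stands in for how often you interact with that friend."""
    now = now or time.time()
    return sum(
        e["affinity"] * EDGE_WEIGHTS.get(e["type"], 0.5) * time_decay(now - e["created"])
        for e in edges
    )

# A story a close friend commented on an hour ago easily outranks one a
# distant acquaintance liked last week.
story_a = [{"type": "comment", "affinity": 0.9, "created": time.time() - 3600}]
story_b = [{"type": "like", "affinity": 0.2, "created": time.time() - 7 * 86400}]
print(edgerank(story_a) > edgerank(story_b))  # True
```

Note the feedback loop: the more you interact with someone, the higher your affinity with them, so the feed shows you more of them, and the echo gets louder.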

I would recommend reading this Slate article, which gives a good summary of the research and findings. I tend to the opinion that the result isn't as fabulous as it is painted to be. Although both strong and weak ties are sources of information, it seems likely that our weak ties also largely share our ideologies...

Regardless of what the research says though, something doesn't feel right.

Yes, it is great that my friends seem to share my ideologies (judging by what I see and read on Facebook), but it seems disingenuous that widely differing ideologies are rarely presented, or any that reflect a popular ideology outside my immediate circles. There might be other explanations. Perhaps my strong and weak ties are by and large young people with similar concerns; perhaps those who don't share my opinions don't spend their time on Facebook; or perhaps I don't actively interact with perspectives, videos, and links outside my ideologies, so my newsfeed doesn't show them to me...

The question then becomes: how do I make sure I don't fall into the trap of groupthink? Punters have talked about the narrowing of perspectives a filter bubble might cause, but what concerns me more is the increased likelihood of 'wilful blindness'.

If everyone around you agrees or shares a similar world view, how will you be exposed to 'disruptive' views?

It is reminiscent of the story of a scientist whose partner's sole job was to try to disprove her theories and find the flaws, until the theory was solid and foolproof. Only then would they publish the work.

It is so important for us to actively listen to opposing views and try to understand where they are coming from, right? Isn't that how we will truly broaden our horizons and bridge those chasms? Otherwise we are just looking at shades of the same primary colour, forgetting there are two other primaries out there...

The Quraan says:

[box] O mankind! We have created you from a male and a female, and made you into nations and tribes, that you may know one another. (49:13) [/box]

We are all different, but ultimately human.  Getting to know each other is part of the deal.

What do you think?