Tuesday, March 12, 2019

Are filter bubbles isolating us intellectually?

Not many of us understood what ‘filter bubble’ meant when Eli Pariser coined the phrase in 2011.

The Internet activist argued, in his TED talk and in his book, that the content we see online is now customized – in fact, personalized to our own likes and preferences – to such a degree that it could be highly detrimental to our collective future.


Today, what we see in our online searches, in our social media news feeds, and in our web-based interactions is based on algorithms written into the software of the companies whose services we use.

These algorithms give us the information we personally prefer, and also target us with advertising that is supposedly relevant to us.

Google, Facebook, Twitter, YouTube, Instagram, and Netflix are all among the many services which now try to read our minds and give us what we want. Or, at least, that’s what they say they do.

But in the process, they are confining us to our own bubbles – filtering out other knowledge that could be useful and important to us.

I believe that in their attempts to make information relevant to us, these companies are ignoring our overall personality; and more importantly, they are snubbing our innate desire to know more than what we are deliberately being exposed to.

Increasing my exposure to posts that reinforce my previous preferences, and decreasing my exposure to posts that conflict with my personal interests, can effectively make me live in a make-believe world: a filter bubble.
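To see how simple the mechanism can be, here is a deliberately toy sketch of preference-based feed ranking – my own illustration, not any platform's actual algorithm (real systems are vastly more complex). Posts that match topics you have liked before rise to the top; everything else sinks.

```python
# Toy sketch of a preference-based feed filter - an illustration only,
# not any real platform's algorithm.

def rank_feed(posts, liked_topics):
    """Score each post by its overlap with topics the user has liked,
    and show the highest-scoring posts first."""
    def score(post):
        return len(set(post["topics"]) & set(liked_topics))
    return sorted(posts, key=score, reverse=True)

posts = [
    {"title": "Local election analysis", "topics": ["politics"]},
    {"title": "New climate report",      "topics": ["science", "climate"]},
    {"title": "Football highlights",     "topics": ["sport"]},
]

# A user who only ever likes sport sees sport pushed to the top -
# and, over time, sees less and less of everything else.
feed = rank_feed(posts, liked_topics=["sport"])
print([p["title"] for p in feed])
```

Run this in a loop – feeding each session's clicks back into `liked_topics` – and the feed narrows with every iteration. That feedback loop is the bubble.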

The 2018 Facebook–Cambridge Analytica data scandal showed us that Cambridge Analytica had harvested the personal data of millions of Facebook users without their consent and had used it for political purposes. It swayed opinions.

This brings us to some serious ethical questions we need to answer: are we responsible for creating the filter bubble ourselves, or are the social media companies and their partners conniving to keep us in that bubble, shaped the way they want it?

In the past, when newspapers were the only medium through which people got information, editors served as the real gatekeepers. They screened news – okay, let’s say they filtered news – to give their readers what was relevant and useful.

Some newspapers, taking pride in a free press and in independent journalism, even published anti-establishment pieces and tried to become the ‘vox populi’.

Most newspapers adhered to journalistic codes of ethics and became responsible channels which sought out and promoted values such as truth, justice, liberty, and equality.

Today, sadly, fed by unverified social media posts and forwards – and fed-up by the political leanings of some newspapers and TV news channels – the public is losing trust in news platforms, including those on the Internet.

Responsible Internet companies should ensure that whatever information passes through their webpages and apps is trustworthy and reliable. They must allocate adequate resources to fight the spread of ‘fake news’ and ‘rumour’. They must allow governments to hold them accountable.

In these days of heightened security and privacy – especially with end-to-end encryption between any two individuals communicating over the Internet – this could be difficult. But moderating at least public groups and posts on social media could empower us with more balanced knowledge.

In his latest article, titled “The Internet Can Make Us Feel Awful. It Doesn't Have to Be That Way” (TIME, 17 Jan 2019), Eli Pariser tells tech companies to “focus on user empowerment and a genuine respect for his or her desires rather than manipulation”.

Some of us may choose to live in our own filter bubbles. 

But all of us must have the choice to break free.
