It’s likely that the quality of your internet research has slowly been degrading over the past 5 years. Without your knowing it, something called the filter bubble has been limiting your discovery process and narrowing your worldview. As a result, your livelihood could be at stake.
The websites we rely on for our research now use algorithms to filter the results to be more pleasing to us. Whether it’s the search engines we use or the news sites we refer to for resources, they all serve up what they think we want to see rather than a true spread of what’s really out there.
It’s called “personalized search” and it’s a problem not only for writers but for anyone who wants to maintain an educated point of view on anything and everything. Eli Pariser first warned us of this phenomenon in 2011 with his ground-breaking book “The Filter Bubble” and minds have been narrowing ever since.
A Filter Bubble recap.
Personalized search produces what’s called “The Filter Bubble,” and most social commentators agree that this bubble leads to a diminished worldview. To anyone who’s in favor of enlightenment, life-long education and a better world, the filter bubble is downright horrifying.
But how does it work?
The filters are driven by data about you such as the following:
- your browser history: are you more likely to click on “this” or “that”?
- other social factors like where you live…
- where you shop
- the browser you use
- the language you use
- the ads you click
- how many typos you make (!)
- how often you search for yourself
- where you get your news and what types of stories you click on most often
- your interests: cat pictures? bodybuilding? travel?
When you perform an internet search, the algorithm weighs all this information about you (Google uses 57 points of data!**) and pushes the things it thinks you’ll be more likely to click on to the top, while the others sink to the bottom.
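To make the idea concrete, here’s a toy sketch of how a handful of personal signals might re-rank the same results for different users. This is purely illustrative: the signal names, weights, and scoring logic are invented for this example and bear no relation to Google’s actual (proprietary) algorithm.

```python
# Toy illustration of personalized ranking (NOT any real search engine's
# algorithm). Signals and weights below are invented for demonstration.

def personalized_rank(results, user_signals):
    """Re-rank results by a crude 'likelihood you'll click' score."""
    def score(result):
        s = 0.0
        # Boost topics the user has clicked on before
        s += 2.0 * len(result["topics"] & user_signals["clicked_topics"])
        # Boost results written in the user's language
        if result["language"] == user_signals["language"]:
            s += 1.0
        return s
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Global economics explainer", "topics": {"economics"}, "language": "en"},
    {"title": "Cat picture roundup",        "topics": {"cats"},      "language": "en"},
]

# A cat-loving user sees cat content pushed to the top...
cat_fan = {"clicked_topics": {"cats"}, "language": "en"}
print([r["title"] for r in personalized_rank(results, cat_fan)])
# ...while everything else sinks to the bottom of the page.
```

Notice that both users are searching the exact same index; only the ordering changes. That reordering is the whole filter bubble in miniature: what you never see might as well not exist.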
And it happens more than you’d think. Even Facebook filters your feed for you. Most news and search websites, as well as shopping sites like Amazon, use these algorithms, and they’re all doing it in the name of relevance.
The problem with “relevance”…
As Eli Pariser pointed out five years ago, relevance is not the only concern when you’re searching the web. It’s good to find exactly what you’re looking for when you search for things online, but on the other hand filtering means you experience less and less “discovery”, which is what opens your mind and educates you.
What if we’re being served only a limited selection (only what’s pleasing), rather than the wonderful, rich variety of info that exists in a wonderfully diverse world? (We are).
That means, for example, that when you’re on Huffington Post you’re only being served topics that you’d find personally relevant, rather than globally important. For writers, especially bloggers, this is akin to serving an athlete only carbs. It’s starvation.
Filtering lends itself to our natural state of confirmation bias.
Filtering also puts us in danger of having a disturbingly incorrect idea of what’s going on in the world around us. For writers, that’s alarming news.
Humans naturally gravitate toward what’s familiar and similar to themselves. Our friends are likely to have similar political views, and we all have a news source we prefer. We even hire people we find likable and familiar. It’s called “confirmation bias”.
“By and large, the focus is on peers, school, parents and reality TV.”
-psychologist & filter bubble specialist Michael Carr-Gregg*
Apply a narrowing algorithm on top of our natural tendency toward the familiar and you end up with an even narrower array of topics and opinions. In short, your world becomes very, very small… provincial.
But for writers, it’s an especially creepy notion to think that our knowledge is being limited by a formula based on who our friends are and what type of soap we buy. It has huge implications for the quality of our work.
What can we do?
For now, there’s not much we can do about the filter bubble except be aware that it exists. You can rest assured that some of the brightest minds are working on the issue, including some at Yahoo Labs. Over there, they’re busy developing a “recommendation engine” meant to connect users with other users who hold opposing views yet share similar interests.
You could try using DuckDuckGo, an alternative search engine that claims to “break you out of your filter bubble”. But if you’re like me, you’ll always have the sneaking suspicion you’re missing some results and be tempted to recreate your search on Bing.
You could also use a browser extension called “Bobble”. It blocks the personalization algorithm so you get unbiased results when you conduct searches. It’s available only for Chrome, which is a Google product.
It’s a good start, but it won’t help you with everything else that’s filtered for your convenience…your Facebook feed, your Amazon shopping search results, your Netflix account and a growing legion of other sites now using personalized results.
The takeaway from all this? You’ve already got it: awareness. Now that you’re aware of how personalization can lead to provincialism, you can fight it with everything you’ve got.
** Source: http://www.rene-pickhardt.de/google-uses-57-signals-to-filter/