[dropcap]T[/dropcap]he danger involved with this can hardly be overstated. Not to know one is blind is an excruciating blindness. If Eli Pariser is right, we are all suffering from a blindness of which we are not aware.
The danger about which Pariser (also a co-founder of Upworthy.com) warns revolves around how our Internet searches are filtered to match our preferences. He calls it “the filter bubble.” The algorithms behind Facebook search, the Facebook news feed, Google, Bing, and others pre-filter our results based on what we like, with whom we have interacted online, where we live, where we are accessing the Internet from, the operating system and browser we use, and more. Results are then shown to us not as unfiltered data, but prioritized as the data the algorithm deduces we most likely want to see. So news stories are fed to us because they are the type of story we read most often, not because they are the most comprehensive, the most accurate, or from a reliable source. Most people have certain political or religious leanings, and filtering can foster the erroneous assumption that more people think like us than actually do.
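The mechanism Pariser describes can be sketched in a few lines. This is a toy illustration, not any real platform's algorithm: a ranker that orders stories purely by how closely they match a user's past clicks, with no signal for accuracy or source quality. All names and data here are hypothetical.

```python
# Toy sketch of preference-based filtering (not a real platform's code):
# stories are ranked by how often the user clicked that topic before,
# ignoring accuracy, comprehensiveness, or source reliability.
from collections import Counter

def rank_stories(stories, click_history):
    """Order stories so topics the user clicked most come first."""
    topic_counts = Counter(click_history)  # e.g. {"politics": 3, "science": 1}
    return sorted(stories,
                  key=lambda s: topic_counts[s["topic"]],
                  reverse=True)

history = ["politics", "politics", "politics", "science"]
stories = [
    {"title": "In-depth climate report", "topic": "science"},
    {"title": "Partisan opinion piece",  "topic": "politics"},
]

for s in rank_stories(stories, history):
    print(s["title"])
# The partisan piece ranks first simply because it matches past clicks,
# which is exactly how a filter bubble forms.
```

Notice that nothing in the scoring function asks whether a story is true or well sourced; engagement history alone decides what the user sees.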
This is not a long video, and it is worth your time.