Check out this talk by Eli Pariser in the TED Talks series, dating from February 2011 (Hat tip: Godlike Productions). Pariser cautions that sites like Yahoo!, Google and Facebook use a surprising amount of personal data about us to tailor the information we find when we search online, or even when we visit news sites. In other words, based on your browsing patterns, the information in your Facebook profile, and which links you click on first, a Google search for a particular term will produce different results for you than for someone else searching the same term. The same goes for some news sites: depending on your history of visits, you will see a different set of news stories than another visitor to the same site.
The filters' gauges of 'relevance' become selective criteria that limit our freedom of access to information. Pariser argues that this leaves individuals without control over what they can see and what they want to see. It is an invisible mechanism of control, a self-censoring system that actually makes our worlds smaller and smaller, not larger and larger.
Pariser argues that the algorithms controlling the flow of information to individuals have no embedded system of ethics. The implication he does not explore is the internet's potential for tyranny if these conditions were pushed to the extreme. At any rate, he claims that news outlets and search engines should give "some" control over "personalization" back to users. He also notes that this situation resembles the position of the press in 1915, when journalistic ethics were in their infancy. It was that system of professional ethics which governed how newspaper editors decided what information reached informed citizens and what did not; in his view, it barely got us through the 20th century. Now, as old-fashioned journalism is swept away by the internet, Pariser calls on the Information Technology community to develop and incorporate similar standards, building objectivity and open-mindedness into the design of its virtual gatekeepers.