When you have a feeling that you might already understand something but are not quite sure, you tend to go searching for information that will confirm your suspicions. When you find that confirming piece of information, you become satisfied that you were correct all along and you stop searching. Once you believe you have made sense of something, you feel no need to continue your efforts, and you stop pursuing new knowledge. When you depend on the Internet for information, such confirmation bias is all-pervasive.
The search sector has tremendous power to determine what we see, where we spend our money and how we perceive the information we find. In The Black Box Society, Frank Pasquale quotes George Dyson as saying in his book Turing’s Cathedral, “Facebook defines who we are, Amazon defines what we want, and Google defines what we think.” Eventually people may give algorithms the authority to make many important decisions. One illustration is the difference between how we choose books in a bookstore and how we choose books on Amazon. When people go to a bookstore, they flip through one book and read the first few sentences of another, until some gut feeling connects them to a particular title.
In the Amazon virtual store, a message pops up and tells them: “I know which books you liked in the past. People with similar tastes also tend to love this or that new book.” Devices such as Amazon’s Kindle can constantly collect data on their users while they read. Your Kindle can monitor which parts of a book you read quickly and which slowly, on which page you took a break, and on which sentence you abandoned the book, never to pick it up again. If the Kindle were upgraded with face-recognition software and biometric sensors, it would know how each sentence influenced your heart rate and blood pressure. It would know what made you laugh, what made you sad, what made you angry. Soon, books will read you while you are reading them. Such data should eventually enable Amazon to choose books for you with uncanny precision.
On one side is Google’s well-known motto – ‘Don’t be evil’. On the other, its former CEO Eric Schmidt once said that “Google policy is to get right up to the creepy line and not cross it.” In The Black Box Society, Frank Pasquale says, ‘It is probably more accurate to say that he and other Silicon Valley leaders don’t want to be caught crossing the creepy line. As long as secrecy can be used to undermine market competition and law enforcement, they will be emboldened to experiment with ever-creepier, more intrusive, and even exploitative practices.’
Algorithm-driven programs popularize more extreme views. People with extreme views are more likely than moderates to express their feelings through clicks, likes and posts. Over time, the algorithm figures out which box you fit into and tailors its results to you. (This will be called ‘enhancing the user experience’, and when you couple it with the future of fake news, you will have a mess.) Moderates give the algorithm far fewer data points to work with, so the targeting of them is less precise.
The way Google’s search function operates ensures that people live in their own bubbles and remain satisfied that the fragmented knowledge they receive represents the whole truth. Google installs a cookie on your computer by default and collects as much information as possible. It knows a user’s general location from the IP address, and their tastes and preferences from their previous usage patterns. Google customizes its search results according to such information, so the results for the same search string may differ across locations and users.
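To make that mechanism concrete, here is a deliberately simplified toy sketch in Python – entirely hypothetical, and in no way Google’s actual system – of how a re-ranker that favours a user’s past clicks ends up showing two users different orderings for the same query:

```python
# Toy illustration only: a made-up personalization re-ranker,
# not a description of how any real search engine works.
from collections import Counter

def personalize(results, click_history):
    """Re-rank generic results using a user's past clicked topics.

    results: list of (url, topic) tuples returned for a query
    click_history: list of topics the user has clicked before
    """
    topic_counts = Counter(click_history)
    # Results on topics the user already favours float to the top,
    # so the same query yields different lists for different users.
    return sorted(results, key=lambda r: topic_counts[r[1]], reverse=True)

generic = [("nytimes.com/a", "politics"),
           ("espn.com/b", "sports"),
           ("arxiv.org/c", "science")]

print(personalize(generic, ["sports", "sports", "politics"]))
# [('espn.com/b', 'sports'), ('nytimes.com/a', 'politics'), ('arxiv.org/c', 'science')]
```

Even in this toy version, the feedback loop is visible: whatever the user has clicked most is what they are shown first, which is what they will click next.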
Hiding behind labels like ‘objectivity’ and ‘neutrality’, Google creates a new reality by the way it slices and presents reality. Google ranks pages according to the number of links they receive and proclaims that its search results show that ‘democracy on the web works’. But some have the resources to generate more links, perhaps by paying influential sites to link to their pages. There are many ways to game the system. Evgeny Morozov writes in To Save Everything, Click Here: The Folly of Technological Solutionism, ‘The neutrality defense is bunk – and the sooner Google itself acknowledges this and finds a way to exercise its newly found powers responsibly, the fewer mistakes . . . it will commit in the future.’
In an interview with the Wall Street Journal in August 2010 (quoted in The Googlization of Everything), Google’s chief executive officer, Eric Schmidt, said, “I actually think most people don’t want Google to answer their questions. They want Google to tell them what they should be doing next. . . . We know roughly who you are, roughly what you care about, roughly who your friends are.” As Google learns more about our search histories and customizes its results according to its estimate of our interests, we will increasingly find ourselves in a bubble. We will never encounter the unexpected, the different, the ‘Other’. We will only get information that fits our prior beliefs.
There are ways to remove this customization, but most people are not technically savvy enough to perform the necessary steps, so they are stuck with Google’s default, which is to gather whatever information it can about an individual. Siva Vaidhyanathan says in The Googlization of Everything, ‘Over time, as users in a diverse array of countries train Google’s algorithms to respond to specialized queries with localized results, each place in the world will have a different list of what is important, true, or “relevant” in response to any query.’ So although information has in theory been made available to everyone, walls get built up in practice.
We tend to think that when people make decisions after discussing an issue in a group, an ‘average’ of the group’s views emerges. But this is not what happens. People take more extreme positions in a group than they would alone, a phenomenon known as group polarization. Many studies from different parts of the world have shown group polarization in action. For example, after a group discussion, people already supportive of a war become more supportive, and people with an initial tendency towards racism become more racist. The same phenomenon occurs in online discussions.
During his presidential campaign, Donald Trump said that a man who tried to rush the stage during a rally had ties to ISIS. When he couldn’t produce any evidence, he said that all he knew about it came from what he saw on the Internet. It is hardly comforting when the most powerful man on the planet believes whatever he finds on the Internet. As Clay Shirky says in his post A Speculative Post on the Idea of Algorithmic Authority, ‘There’s a spectrum of authority from “Good enough to settle a bar bet” to “Evidence to include in a dissertation defense”, and most uses of algorithmic authority right now cluster around the inebriated end of that spectrum…’
More data makes us feel that we can make more accurate predictions. But those predictions change human social and political behaviour, thereby negating themselves soon after they are made. Thus more data will make the world more complex and unpredictable than before. I keep recalling a statement by Gandhi in Hind Swaraj – ‘I am prepared to maintain that humbugs in worldly matters are far worse than the humbugs in religion.’ And the ‘silly’ old man was right (which doesn’t surprise me anymore). Ironically, the ‘modernist’ vision of Nehru has become controversial while the ‘obscurantist’ Gandhi has become more relevant. Joseph Brodsky writes in Less Than One:
There is something in the consciousness of literati that cannot stand the notion of someone’s moral authority. They resign themselves to the existence of a First Party Secretary or of a Führer, as to a necessary evil, but they would eagerly question a prophet. This is so, presumably, because being told that you are a slave is less disheartening news than being told that morally you are zero. After all, a fallen dog shouldn’t be kicked. However, a prophet kicks a fallen dog not to finish it off but to get it back on its feet. The resistance to those kicks . . . comes not from a desire for truth but from the intellectual smugness of slavery.