According to the prophets, the arrival of the internet was going to be the biggest thing to happen to democracy since the invention of the ballot box. Nothing like the Rwandan genocide could ever happen again, the former British PM Gordon Brown insisted, 'because information would come out far more quickly about what is actually going on and the public opinion would grow to the point where action would need to be taken’. The message was: large doses of information and communications technology are bound to prove lethal to the most repressive of regimes. What the cyber-utopians failed to grasp was that the internet can just as easily be used to control people as it can be used to educate them.
Massive protests erupted in Iran in 2009 amid suspicions of a fraudulent election. Because the protests were thought to be fuelled by tweets, cyber-utopians lost no time in claiming that the Internet would spell the doom of dictators everywhere and that liberal democracy was the only game in town. So much so that 'the Internet' was among the nominations for the 2010 Nobel Peace Prize. What they failed to realize was that tweets don't topple governments; people do. A real revolution sooner or later demands sacrifices from the population, not just typing on computers.
After the failed uprising, the Iranian government hunted down dissidents online, tracking them through their emails and using face-recognition technology to identify people from pictures taken on mobile phones. The authorities also turned the technology to their own ends, sending messages warning Iranians to stay away from street protests. The police combed through the personal details, such as Facebook profiles and email addresses, of Iranians living abroad and threatened that their relatives back home would suffer if they kept inciting protests. Governments use social networks to infiltrate protest groups, track down protesters, and seed their own propaganda online.
In 21 Lessons for the 21st Century, Yuval Noah Harari says that many people fear AI algorithms because they think the algorithms will not remain obedient to us. But the problem with algorithms is exactly the opposite: they will do precisely what they are ordered to do. In benign hands, algorithms and robots will produce tremendous benefits. However, if countries of the "axis of evil" embrace the new technologies, people might end up under a complete surveillance regime in which every action and utterance is followed and controlled by a future Big Brother, living in what Harari calls "digital dictatorships." Eventually, worn down by extensive propaganda and the constant fear of being marked as dissenters, the population of such a digital dictatorship will come to obey Big Brother unconditionally. As George Orwell wrote in 1984:
We do not destroy the heretic because he resists us: so long as he resists us we never destroy him. We convert him, we capture his inner mind, we reshape him. We burn all evil and all illusion out of him; we bring him over to our side, not in appearance, but genuinely, heart and soul. We make him one of ourselves before we kill him. It is intolerable to us that an erroneous thought should exist anywhere in the world, however secret and powerless it may be.

An example of such a digital dictatorship is the 'Social Credit System' (SCS) being implemented in China. Every citizen in China would be given a score, available for all to see. This citizen score comes from monitoring an individual's social behavior, from their spending habits and how regularly they pay bills to their social interactions, and it will become the basis of that person's trustworthiness, which would also be publicly ranked. What people can and cannot do, such as the kinds of jobs or mortgages they can get and the schools their children qualify for, will depend on how high their "citizen score" is.
Credit agencies already trace how promptly we pay our debts, producing a score that lenders and mortgage providers use. eBay rates sellers on shipping times and communication, while Uber drivers and passengers rate each other; if your score falls too far, you're in trouble. China's social credit system expands that idea to all aspects of life, judging citizens' behavior and trustworthiness. Get caught jaywalking, fail to pay a court bill, or play your music too loud on the train, and you could lose certain rights, such as booking a flight or train ticket. China may be implementing the idea overtly, but that doesn't mean the idea is new or that it doesn't already exist elsewhere in more skeletal form.
Supporters of the SCS see it as an opportunity to improve some of the state's services; some argue that it would give Chinese citizens much-needed access to financial services. It's all about building trust, says the Chinese government: the 2014 document describing the government's plans notes that because "trust-keeping is insufficiently rewarded, the costs of breaking trust tend to be low." But no technology comes only with benefits; it also comes with costs, which its champions tend to play down. A system like this could paint a very inaccurate and incomplete picture of a person.
People do many different things for many different reasons, and behavior stripped of its context is easily misconstrued. That is exactly what happens when algorithms compute correlations from large amounts of data. Someone who plays video games for ten hours a day, for example, could be classified as idle, when in fact he is a games developer testing a new product. A person researching various terrorist organizations could be flagged by an algorithm as someone for the security agencies to watch, when he is just a journalist doing his job. The system can also be used to enforce vague laws, such as those against endangering national security or unity.
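To make the context problem concrete, here is a toy sketch of how such a scoring rule goes wrong. Everything in it, the rule, the threshold, and the profiles, is invented for illustration; no real system works exactly this way.

```python
# Hypothetical illustration: a naive behavioral score that ignores context.
# The rule, threshold, and profiles below are invented for this sketch.

def naive_score(profile):
    """Penalize heavy gaming, regardless of WHY the person games."""
    score = 100
    if profile["gaming_hours_per_day"] >= 8:
        score -= 40  # flagged as "idle"
    return score

tester = {"who": "QA game tester", "gaming_hours_per_day": 10}
casual = {"who": "occasional player", "gaming_hours_per_day": 1}

# The tester, who games for a living, scores worse than the casual player:
print(naive_score(tester))   # 60
print(naive_score(casual))   # 100
```

The correlation ("heavy gamers tend to be idle") may even hold on average, but applied to an individual without context it produces confident, wrong judgments.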
China has developed advanced facial recognition systems that can follow people across entire cities. In a show of power at the end of 2017, Chinese officials, working in cooperation with BBC News, demonstrated how the system could track down one of the organisation's reporters within seven minutes. Ultimately, the problem is that "socially acceptable behavior" will be defined by the Chinese government, not by a democratic process, and the government will now have a way of monitoring virtually every aspect of citizens' lives.
The system the Chinese are putting in place is just an expanded version of what already exists in many democratic countries. Police and intelligence agencies use databases built by the private sector to revolutionize their own role in society. The government will say that you have nothing to worry about if you have nothing to hide. But if your political activities or interests deviate even slightly from the mainstream, you do. Thousands of people are caught in data-driven dragnets for being activists, or simply for belonging to a suspect "identity" group. Careful protection of the boundary between crime and dissent is not a high priority for the intelligence apparatus; as far back as 2002, FBI director Robert Mueller said that "there is a continuum between those who would express dissent and those who would do a terrorist act."