Saturday, May 18, 2019

Technology Series: (Part Five): 'One Month, 500,000 Face Scans: How China Is Using A.I. to Profile a Minority:' A disturbing article by New York Times reporter Paul Mozur detailing how ethnic profiling software developed for use in China could be "easily put" in the hands of other governments..."The facial recognition technology, which is integrated into China’s rapidly expanding networks of surveillance cameras, looks exclusively for Uighurs based on their appearance and keeps records of their comings and goings for search and review. The practice makes China a pioneer in applying next-generation technology to watch its people, potentially ushering in a new era of automated racism."


PUBLISHER'S NOTE: In recent years, I have found myself publishing more and more posts on the application of artificial intelligence technology to policing, public safety, and the criminal justice process, not just in North America, but in countries all over the world, including China. Although I accept that properly applied science can play a positive role in our society, I have learned over the years that technologies introduced for the so-called public good can eventually be used against the people they were supposed to benefit. As reporter Sieeka Khan writes in Science Times: "In 2017, researchers sent a letter to the secretary of the US Department of Homeland Security. The researchers expressed their concerns about a proposal to use AI to determine whether someone seeking refuge in the US would become a positive and contributing member of society, or whether they would likely become a threat or a terrorist. Other government uses of AI are also being questioned, such as attempts at setting bail amounts and sentences for criminals, predictive policing, and hiring government workers. All of these attempts have been shown to be prone to technical issues, and limited data can bias their decisions based on gender, race or cultural background. Other AI technologies like automated surveillance, facial recognition and mass data collection are raising concerns about privacy, security, accuracy and fairness in a democratic society. As Trump's executive order demonstrates, there is a massive interest in harnessing AI for its full, positive potential. But the dangers of misuse, bias and abuse, whether intentional or not, risk working against the principles of international democracies. As the use of artificial intelligence grows, the potential for misuse, bias and abuse grows as well."
The purpose of this 'technology' series is to highlight the dangers of artificial intelligence - and to help readers make their own assessments as to whether these innovations will do more harm than good.

Harold Levy: Publisher: The Charles Smith Blog.

----------------------------------------------------------

PASSAGE OF THE DAY: "The Chinese government has drawn wide international condemnation for its harsh crackdown on ethnic Muslims in its western region, including holding as many as a million of them in detention camps. Now, documents and interviews show that the authorities are also using a vast, secret system of advanced facial recognition technology to track and control the Uighurs, a largely Muslim minority."

-----------------------------------------------------------

STORY: "One Month, 500,000 Face Scans: How China Is Using A.I. to Profile a Minority," by reporter Paul Mozur, published by The New York Times on April 14, 2019. (Paul Mozur is a technology reporter based in Shanghai. Along with writing about Asia's biggest tech companies, he covers cybersecurity, emerging internet cultures, censorship and the intersection of geopolitics and technology in Asia. A Mandarin speaker, he was a reporter for The Wall Street Journal in China and Taiwan prior to joining The New York Times in 2014.)

GIST: "The Chinese government has drawn wide international condemnation for its harsh crackdown on ethnic Muslims in its western region, including holding as many as a million of them in detention camps. Now, documents and interviews show that the authorities are also using a vast, secret system of advanced facial recognition technology to track and control the Uighurs, a largely Muslim minority. It is the first known example of a government intentionally using artificial intelligence for racial profiling, experts said. The facial recognition technology, which is integrated into China’s rapidly expanding networks of surveillance cameras, looks exclusively for Uighurs based on their appearance and keeps records of their comings and goings for search and review. The practice makes China a pioneer in applying next-generation technology to watch its people, potentially ushering in a new era of automated racism. The technology and its use to keep tabs on China’s 11 million Uighurs were described by five people with direct knowledge of the systems, who requested anonymity because they feared retribution. The New York Times also reviewed databases used by the police, government procurement documents and advertising materials distributed by the A.I. companies that make the systems. Chinese authorities already maintain a vast surveillance net, including tracking people’s DNA, in the western region of Xinjiang, which many Uighurs call home. But the scope of the new systems, previously unreported, extends that monitoring into many other corners of the country."............................."Yitu and its rivals have ambitions to expand overseas. Such a push could easily put ethnic profiling software in the hands of other governments, said Jonathan Frankle, an A.I. researcher at the Massachusetts Institute of Technology. “I don’t think it’s overblown to treat this as an existential threat to democracy,” Mr. Frankle said. “Once a country adopts a model in this heavy authoritarian mode, it’s using data to enforce thought and rules in a much more deep-seated fashion than might have been achievable 70 years ago in the Soviet Union. To that extent, this is an urgent crisis we are slowly sleepwalking our way into.”

The entire story can be read at:

https://www.nytimes.com/2019/04/14/technology/china-surveillance-artificial-intelligence-racial-profiling.html