In a recent article, I recounted the story of Dana Kurtbek, who has faced harassment from the DHS and the FBI after facial recognition technology and anonymous reports placed her inside the Capitol Building during the riot on January 6th.
By her own account, she never came within a mile of the Capitol. She expressed concern to Mind Matters News that the continuing harassment may have resulted from neighbors who disagree with her pro-Trump views reporting her to the federal government.
Facial recognition technology and neighbors as informants may sound strange to American ears, but in China, both are essential elements of the Chinese Communist Party’s technocratic regime.
In 2014, China unveiled a social credit system that Human Rights Watch called “chilling.” The new system rewards and punishes Chinese citizens for behavior deemed good or bad by the State. It is made possible by 200 million surveillance cameras and state-of-the-art facial recognition technology backed by a database covering nearly all of China’s 1.4 billion people.
“…keeping trust is glorious and breaking trust is disgraceful,” reads the government document announcing its plans for mass surveillance.
It means keeping or breaking trust with the Chinese Communist Party (CCP).
At its most benign, the system uses public shaming to punish social reprobates. For instance, jaywalkers may find their names, pictures, and addresses posted on a public billboard or televised on the news to alert neighbors and peers to the violation of social norms.
At its most malevolent, China’s use of facial recognition technology is a tool in what has been called “the largest mass incarceration of a minority population in the world today” – a relentless and intentional targeting of the Uyghur Muslim population in the province of Xinjiang. In an interview with CBS News, VICE News correspondent Isobel Yeung called Xinjiang “the strictest state in the world right now.”
“Security cameras are absolutely everywhere,” she told CBS News, “facial recognition everywhere, voice recognition, face scanning, iris scanning, body scanning, the phone is scanned as well to check for any content that might upset the Chinese Communist Party.”
Of the 12 million Uyghurs in Xinjiang, 1.5 million have been sent to or are currently detained in one of China’s nearly 400 “re-education camps,” where sexual abuse and torture, forced sterilization, and other human rights abuses have been reported.
A letter sent to then-Secretary of State Mike Pompeo from a bipartisan group of Senators in 2020 called it a form of “technology governance” whose practices “harken(s) back to troubling practices related to phrenology and eugenics.” The letter called on the Trump administration to defend human rights and a free and open internet in response to China’s use of technology and artificial intelligence.
For the vast majority of the Chinese population who are lucky enough not to be Uyghur Muslims, social credit scores determine the relative ease or difficulty of daily life. The German foundation Bertelsmann Stiftung has created a graphic to help outsiders understand the system.
Good behavior raises a social credit score while bad behavior lowers it. Examples of good behavior include giving to charity, volunteering, helping one’s grandparents or other elders, and praising the government on social media. Bad behavior includes cheating, jaywalking and other traffic offenses, “spreading rumors” or posting anti-government messages on social media, and participating in anything the government deems “cult-like.”
A social credit score in good standing allows one privileges such as discounts, tax breaks, access to the best education, and shorter wait times in hospitals. Citizens with a low credit score, on the other hand, are publicly shamed, unable to purchase tickets for airplanes or high-speed trains, and ineligible for certain schools and jobs.
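The reward-and-penalty mechanic described above can be pictured as a simple rules-based ledger. The sketch below is purely illustrative: every rule name, point value, and threshold is hypothetical (no published point schedule is implied), and real implementations vary by region and operator.

```python
# Illustrative sketch of a rules-based scoring system like the one described
# above. All behaviors, point values, and tier thresholds are hypothetical.

BASELINE = 1000

# Hypothetical point adjustments for behaviors the article lists.
ADJUSTMENTS = {
    "charity_donation": +30,
    "volunteering": +20,
    "praising_government_online": +10,
    "jaywalking": -50,
    "spreading_rumors": -100,
}

def score(events):
    """Baseline plus the sum of adjustments for observed behaviors."""
    return BASELINE + sum(ADJUSTMENTS.get(e, 0) for e in events)

def privileges(s):
    """Map a score to hypothetical tiers of access, mirroring the article."""
    if s >= 1050:
        return "discounts, shorter hospital waits"
    if s >= 950:
        return "normal access"
    return "blacklisted: no plane or high-speed train tickets"

# Example: one donation, then a jaywalking citation and a "rumor" post
# drop the citizen below the hypothetical blacklist threshold.
s = score(["charity_donation", "jaywalking", "spreading_rumors"])
print(s, privileges(s))  # 880, blacklisted tier
```

The point is only that such a system needs no human judgment once the rule table is written: whoever controls the table controls the outcomes.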
According to a report from China’s National Development and Reform Commission in July 2019, 2.56 million people had been restricted from flying, 90,000 people had been restricted from the use of high-speed trains, and 300,000 people were “deemed untrustworthy by Chinese courts.”
In 2016, a Chinese lawyer named Li Xiaolin found himself stranded 1200 miles from home after a Chinese court placed him on a “blacklist” while he was on a work trip. Due to his social credit standing, he was unable to purchase a train ticket home. His crime? After being sued by a victim of rape for his legal defense of the man accused, Li was ordered by a court to write an apology. But his apology was ruled “insincere” and he was subsequently blacklisted.
It took yet another apology letter for the court to remove Li from the travel ban list, and still another before he was allowed to apply for credit cards again.
Where facial recognition technology fails to identify society’s saints and reprobates, China has 850,000 “informants” paid by the state to keep detailed notebooks of the deeds and misdeeds of their neighbors.
In an in-depth analysis of what he calls the “AI-powered techno-totalitarian state,” John Lanchester writes, “This is as pure a dream of a totalitarian state as there has ever been – a future in which the state knows everything and anticipates everything, acting on its citizens’ needs before the citizen is aware of having them.”
Kurtbek’s experience points to the fact that America has been flirting with policing by facial recognition for years. In 2020, UCLA proposed a facial recognition program as part of its on-campus safety efforts. The Detroit Police Department has been using facial recognition technology to identify criminals since 2017. Chicago has been using the technology since 2013. Other jurisdictions are currently testing the waters.
More chilling still is the amount and scope of information we have voluntarily surrendered to Big Tech. We may not live in a surveillance state under a Communist government, but by surrendering our data to Facebook, Google, Apple, and other internet giants, we have subjected ourselves to living within what Shoshana Zuboff calls “surveillance capitalism.”
John Lanchester underscores the theme:
Much if not all of the technology currently developed in China already exists in the West, in forms that are just as intrusive. The difference is that the technology is almost all in the hands of private companies. AI, big data, facial recognition: Facebook, Google, Amazon, Apple and any number of smaller and emerging companies are deeply invested in these fields. Add what these companies know about you to the colossal amount of data held by the credit reference agencies, and we are as fully open to surveillance in the West as are the citizens of the People’s Republic.

John Lanchester, “Document Number Nine” at London Review of Books
“Big Brother is not exactly who we expected him to be,” writes Rod Dreher in Live Not By Lies. “…He’s a salesman, he’s a broker, he’s a gatherer of raw materials, and a manufacturer of desires. He is monitoring virtually every move you make to determine how to sell you more things, and in so doing, learning how to direct your behavior.”
One has only to watch Netflix’s The Social Dilemma to understand just how watched we are by Big Tech.
These surveillance state inroads in America have met some resistance. Pushback from students at UCLA forced the school to abandon the program. The use of facial recognition technology in law enforcement has been fiercely opposed, largely on the evidence that the technology disproportionately misidentifies women and minorities. Portland and San Francisco are two of many cities that have banned facial recognition technology.
Most recently, the ACLU called on President Biden on Wednesday “to pause all federal government use of face recognition technology” on the grounds of biased programming and civil liberties violations. They are circulating a petition, hoping that broad support will garner the attention of the Biden administration.
Since its rollout in 2014, China’s social credit system has largely been run by local jurisdictions and private companies (in close partnership with the CCP), which means rules and standards vary by region. A fully-functioning, centralized system was scheduled to begin in 2020, but has now been delayed due to the impact of COVID-19.
You may also wish to read:
False identification: One woman’s facial recognition nightmare Government security officials and the ACLU stay mum, despite repeated requests for information, involvement in the Pennsylvania woman’s case. “I didn’t really believe him,” she said. If the DHS officer’s claim was true, “it’s the worst facial recognition software on the planet because I wasn’t there.” (Caitlin Bassett)
Leaked police database: Total surveillance of China’s Uyghurs Human Rights Watch notes that many countries engage in human rights abuses, but “more than any other government, Beijing has made technology central to its repression.” (Heather Zeiger)