Society is sleepwalking into a miserable nightmare of surveillance, according to the tech experts and researchers who spoke to Citywire.
Their warnings come in response to the growth of the biometrics and facial recognition sector, which is beginning to dramatically affect human behavior and the communities it monitors.
Ron Steinberg, an anthropologist who specializes in Xinjiang, Uyghurs and economic anthropology, has witnessed first-hand the way mass surveillance can affect people’s behavior and daily habits.
The Berlin-based anthropologist described the spread of facial recognition and surveillance as a ‘global phenomenon’ and said its potentially malicious uses have yet to be fully grasped.
Steinberg added that the only way to cope with surveillance in Xinjiang is to assume you are always being watched.
"Always assume your camera and microphone are on, and just make sure not to do anything that might lead someone to report you to the authorities."
After witnessing how the introduction of this technology changed the behavior of his Uyghur friends, he warned that it could soon become a reality of life for those in the West unless lawmakers enact stricter laws to limit its widespread use.
"We are sleepwalking into a miserable nightmare while watching it happen right in front of us. The attitude that 'it could never happen here' is troubling."
Steinberg's warning comes from personal experience: in 2015, while living in Xinjiang, he was told of the disappearance of one of his Uyghur friends, with whom he had long been close.
"One of my Uyghur friends sat me down one day to tell me our friend was missing, but before he could even talk about it, he put our phones in a different room and turned on the tap."
"You could see people realize they were being watched very early on, before the system was fully built out around 2017 to 2018. People became paranoid because they realized that people they knew were disappearing. Suddenly the police would show up at people's doors knowing things they shouldn't have known."
Steinberg said he has not heard from his missing friend since his disappearance.
His warning comes after the United Nations released a report on August 31, 2022 that focused on the widespread 'arbitrary' detention of Uyghurs in China's western Xinjiang region.
It concluded that Beijing's actions could constitute "crimes against humanity". Steinberg added that repression on this scale was enabled in part by large-scale surveillance technology.
"It's not just happening in China; it could come to us tomorrow, as it's happening globally, in different ways and to different degrees around the world."
"We are looking at a global phenomenon, both in terms of labor exploitation and surveillance. For example, surveillance in Western workplaces is very sophisticated, perhaps even more advanced than in China. There are fewer police involved, but it's more technologically advanced," Steinberg said.
While China’s use of surveillance is excessive, it is no exception in adopting this technology.
This is shown in a 2019 report from the Carnegie Endowment for International Peace, which highlighted that AI-enabled surveillance technology was used in at least 75 of the 176 countries it studied.
China is the largest supplier of this technology, having sold it to 63 countries, while US companies have sold to 32, according to a Financial Times report earlier this year.
Asset managers divided
Although some countries have taken action to limit surveillance firms by placing them on blacklists, asset managers remain divided on the issue.
While some pledge not to invest in companies that pose threats to human rights, others believe engagement is the better path.
Technology ethics researcher Stephanie Hare, author of Technology Is Not Neutral: A Short Guide to Technology Ethics, pointed to a 2019 example of surveillance technology violating workers' rights.
In that case, 14 people who worked for Uber Eats told Wired they had been improperly suspended or fired because they failed the company's "real-time identity verification" checks.
This is a system that uses 1:1 face-verification technology to verify that drivers are not subcontracting shifts to people who have not passed background checks or are not qualified to work.
Hare, who presented her findings about the risks of facial recognition technology to the UK’s all-party parliamentary group on artificial intelligence, believes governments need to urgently rethink their continued implementation of surveillance systems.
In her book, Hare breaks down the different types of uses for facial recognition and categorizes them according to the risk to civil liberties and privacy.
Unlike Steinberg, Hare said there is a broad understanding of the dangers of surveillance technology among governments and the public, adding that indifference is one of the biggest obstacles.
"Governments know how dangerous this can be, and the public has been told time and time again about the dangers of this type of surveillance, in countless articles on the topic. But when you have societies grappling with serious political and social issues, people find it hard to care about these things.
"Just look at Australia, which is developing a very creepy monitoring system called 'The Capability'. Why are we not looking at this? Is it because it is too far away and we don't care? We really need to monitor the evolution of this sector."
US blacklist of AI surveillance companies
In June 2021, President Biden signed an executive order preventing Americans from investing in more than 40 companies, including those that enable human rights abuses against Uyghur Muslims in Xinjiang.
Among those banned is Dahua Technology, a partially state-owned, publicly traded company that sells video surveillance products and services.
Other banned surveillance entities include Cloudwalk Technology, Dawning Information Industry, Leon Technology Company, Megvii Technology, Netposa Technologies, SZ DJI Technology, Xiamen Meiya Pico Information, and Yitu.
For these reasons, Matthews Asia fund manager Vivek Taneiro has never invested in companies such as Alibaba and Tencent. The Citywire AAA-rated Taneiro, who manages the firm's ESG Asia and emerging markets funds, said his policy of not investing in these companies has remained consistent.
“We are running a fund under Article 9, and so we believe that not investing in these companies is the right thing to do,” Taneiro said.
On the other hand, one asset manager favoring engagement over exclusion is Fidelity International, which confirmed a "small holding" in Dahua in its Fidelity Pacific and Fidelity Global Multi Asset Income funds.
A Fidelity spokesperson told Citywire: "We have a very small holding in Dahua. Fidelity has a long-standing and large presence in mainland China, with more than a thousand employees in the country.
"We actively engage with our investee companies, and this is our preferred path to sustainable investing, rather than exclusion. We have found that the Chinese companies we hold are highly responsive to our active engagement and have changed their policies accordingly.
“When companies fail to improve against agreed goals or develop a pattern of deteriorating sustainability outcomes, we will review our holdings and potentially dispose of them.”
While many say there is much more work to be done in raising awareness of the sector's risks, activists and ESG analysts have not stood idly by as civil rights concerns have been raised about invasive biometric monitoring technology.
These include Candriam, which in 2021 launched a collaborative engagement project on surveillance technology with 51 companies, as well as publishing a white paper on the risks associated with surveillance technology and facial recognition.
A 2021 report by the asset manager highlighted the risks posed by facial recognition technology and warned of its potential threats to human rights.
The report mentioned how law enforcement agencies are already deploying facial recognition technology on a large scale globally, with an estimated 1 billion surveillance cameras expected to be operational by the end of 2021.
“Today, citizens of Detroit, London, Monaco, Moscow, Beijing and other places are wandering unaware that their faces are being scanned by police-operated facial recognition systems,” the report read.
US policy analyst Isidua Oribador, who works for Access Now, also warned in the same report of the risks to privacy and human rights posed by the gender and racial biases entrenched in these systems. Like Steinberg and Hare, the report called on lawmakers to enact stricter regulations.
“It is necessary to examine these risks and draw red lines around where the use of this technology conflicts with respect for human rights.”