As cases of violence against women and girls have increased in South Asia in recent years, authorities have introduced tougher penalties and expanded surveillance networks, including facial recognition systems, to prevent such crimes.
Police in the northern Indian town of Lucknow announced earlier this year that they would install cameras with emotion recognition technology to spot harassed women, while in Pakistan, the police launched a mobile security app after a gang rape.
But the use of these technologies without evidence that they help reduce crime and without data protection laws has raised alarm bells among privacy experts and women’s rights activists who say the increased surveillance can hurt women even more.
“The police don’t even know if this technology works,” said Roop Rekha Verma, a women’s rights activist in Lucknow, capital of Uttar Pradesh, the state that recorded the highest number of reported crimes against women in India in 2019.
“Our experience with the police does not give us confidence that they will use technology effectively and empathetically. If not deployed properly, it can lead to even more harassment, including from the police,” she said.
Lucknow is one of eight cities implementing a Safe City project, which aims to create a “safe, secure and empowering environment” for women in public places, and to fight crime through “safer urban infrastructure and effective access” to the police.
But the project – alongside the 100 Smart Cities program that uses technology to improve services – is being used to exponentially increase surveillance, said Anushka Jain, associate lawyer at the Internet Freedom Foundation in Delhi.
“Authorities have used crimes against women as a justification for stepping up surveillance, but massive spending on CCTV and facial recognition technology does not correlate with a corresponding drop in crimes against women,” she said by phone.
“By disproportionately targeting women, (the authorities) create new problems in a society where women are already constantly watched in their homes, and for whom anonymity in public places is so important,” she said.
Lucknow Police Commissioner DK Thakur declined to give details of how the technology will be deployed and how the data will be monitored or used.
Around the world, the rise of cloud computing and artificial intelligence technologies has popularized the use of facial recognition for a range of applications from tracking criminals to admitting spectators.
In Pakistan and India, these systems are touted as necessary to modernize understaffed police forces and facilitate their information gathering and criminal identification processes.
But tech and privacy experts say the benefits are unclear and the systems could infringe on people’s privacy, and that without data protection laws, there is little clarity on how data is stored, who can access it and for what purpose.
The technology is also plagued by accuracy problems, particularly in identifying darker-skinned women, non-binary people and members of ethnic minorities.
Delhi Police in 2018 reported that their test facial recognition system had an accuracy rate of 2%. The Ministry of Women’s and Children’s Development later said the system could not accurately distinguish between boys and girls.
“We need to question the effectiveness of this solution and the reliance on digital infrastructure to solve socio-technical challenges,” said Ashali Bhandari, senior city planner at Tandem Research in Goa.
“It’s ironic that in order to protect women from unwanted attention, they need to be constantly monitored through digital technology networks. It is not the empowerment of women, but rather the idea that women should be watched for their own safety,” she said.
At least 50 facial recognition systems are in place across India, and the government plans to roll out a nationwide network. Dozens of cities have also introduced mobile security applications.
Meanwhile, a rape is reported in India every 15 minutes, according to government data, and crimes against women have nearly doubled to more than 405,000 cases in 2019, from around 203,000 in 2009.
Privacy trade-off
There is a growing backlash in North America and Europe against the use of facial recognition technology. But in Asia, it is widely deployed.
As part of Pakistan’s Safe Cities project, thousands of CCTV cameras have been installed in Lahore, Islamabad, Karachi and Peshawar.
Footage from Islamabad’s cameras showing couples traveling in vehicles was leaked in 2019, while women at the University of Balochistan said they were blackmailed and harassed that same year by officials using footage from CCTV cameras on campus.
Following a gang rape last year on a highway equipped with CCTV cameras, the Punjab police have launched a mobile security app that collects the user’s personal information when they send an alert to the police in case of emergency.
This includes access to phone contacts and media files – leaving women vulnerable to further harassment, say privacy groups.
“Technological interventions that aim to increase surveillance of women in order to ‘protect’ them often mimic family and societal surveillance of women,” said Shmyla Khan, director of research and policy at the Digital Rights Foundation.
“Women cannot be expected to trade their privacy for vague assurances of security without proper mechanisms and transparency from the government,” she added.
Punjab police did not respond to a request for comment.
‘Project focused on surveillance’
The Indian cities of Chennai, Hyderabad and Delhi are among the 10 most surveilled cities in the world, according to virtual private network company Surfshark.
Chennai, which tops the index with 657 CCTV cameras per square kilometer, compared with 278 in Beijing at the bottom of the top 10, is implementing the Safe City project by mapping high-crime areas and tracking buses and taxis with CCTV networks and “smart” poles.
“The government didn’t want to just do more monitoring, but to look at it more holistically to address the challenges women face at home, on their commute, at work and in public places,” said Arun Moral, director of the consultancy firm Deloitte, which advises the city on the project.
“There is a technical intervention for every challenge.”
An audit of Delhi’s Safe City project last year noted that the effectiveness of cameras in preventing crimes against women had not been studied. Only about 60% of the installed video surveillance systems were functional and less than half were monitored.
The “project strongly focused on police surveillance” in Delhi needs to be reviewed, the audit said.
Yet little progress has been made in tackling violence against women through measures such as education and increasing the number of female police officers, who make up less than 10% of the force, according to official data.
“The authorities believe that technology alone can solve the problems, and the purported solutions receive little scrutiny because they are sold in the name of security,” said Jain of the Internet Freedom Foundation.
“Authorities – even your own family – can cite security as a justification for increased surveillance, as security is a bigger concern than privacy,” she said.