500 facial recognition cameras deployed in Indian railway stations: Report

NtechLab, which is partially funded and owned by the Russian government, lists "race" among the features its software can detect

Representative Image. Credit: AFP File Photo

In its effort to increase surveillance, the Indian Railways has deployed a network of about 500 facial recognition cameras to keep track of commuters, according to a report by The Financial Times.

DH couldn't independently verify the report.

Developed by NtechLab, a Russian start-up, the system has been active for a month in 30 railway stations across Gujarat and Maharashtra, the report added.

Facial recognition technology is being increasingly deployed in airports, railway stations and cafes across India, with plans for nationwide systems to modernise the police force and its information gathering and criminal identification processes.

But technology analysts and privacy experts say the benefits are unclear, and that the technology could breach people's privacy or lead to greater surveillance, with little clarity on how it works, how the data is stored, and who can access that data.

According to the report, AI Observatory, an independent research organisation, has identified at least 40 government-funded projects in India related to video surveillance and facial recognition.

NtechLab chief executive Andrey Telenkov told the publication that the system can simultaneously recognise up to 50 people, including those wearing masks, and count passenger traffic. He stressed that the technology can also be used to identify criminals and search for missing persons.

"India is a huge market for video surveillance and the biggest worldwide. This is the entry point to one of the largest clients there... Now it is a few railway stations, but I believe they do want to expand the solution across the entire network," he said.

NtechLab's products are sold to authorities and businesses in Russia and abroad. It supplied 178,000 facial recognition cameras to Moscow, where authorities say the technology has helped cut crime and enforce coronavirus lockdown restrictions.

But digital rights activists say its use is problematic without a data protection law, and that it threatens the right to privacy, which was declared to be a fundamental right by the Supreme Court in a landmark ruling in 2017.

"The police are using the technology to solve a problem without considering that this will simply become a new form of surveillance, a new form of exercising power over women," said Vidushi Marda, a researcher at human rights group Article 19.

"AI is not a silver bullet, and no amount of 'fancy' tech can fix societal problems," she said. 

(With Thomson Reuters Foundation inputs)
