As if facial recognition technology isn’t controversial enough, Russian firm NTech Lab has announced plans to roll out “aggression detection” features worldwide as soon as 2021.
NTech Lab is the company responsible for creating Russia’s expansive real-time facial recognition surveillance system.
The firm, best known for the FindFace app—once dubbed a herald of the end of online privacy—has just announced that it is set to roll out “aggression detection” as well as “violence detection” features, which will alert law enforcement when the algorithm judges that someone is committing, or about to commit, violence.
To fund its foray into “emotion detection,” NTech Lab has just received a $15 million cash injection from two sovereign wealth funds—one the Russian Direct Investment Fund, the other an unnamed partner in the Middle East.
This funding will also aid the company’s plans to expand in the Middle East, Latin America, and possibly Asia, Forbes reported.
A major concern?
One of the main criticisms of facial recognition is the substantial body of evidence showing it to be racially biased against Black people and people of colour, often flagging faces with darker skin tones as more suspicious.
Tests of emotion detection across skin tones raise even more questions.
In a study from the University of Maryland, tools from Microsoft and Chinese company Face++ were tested on photos of 400 NBA players. The Face++ AI consistently interpreted Black players as “angrier” than white players, even when both were smiling. Microsoft’s AI rated Black players as more “contemptuous” when their facial expressions were ambiguous.
“Until facial recognition assesses black and white faces similarly, black people may need to exaggerate their positive facial expressions—essentially smile more—to reduce ambiguity and potentially negative interpretations by the technology,” study author Lauren Rhue wrote about the research.
NTech Lab’s cofounder Artem Kukharenko has said the product needs to be “100% sure” before launch. The company later clarified that its aggression detection is currently in an “early development phase,” and that it therefore could not provide figures on its accuracy.