A firm has created facial recognition software that works with surveillance cameras to identify emotions on faces. This may allow for real-time crime prevention, as violent facial expressions are recognized before an incident occurs.
Recognizing Violence In Faces
A Russian firm called NTechLab has created software that, when used in tandem with surveillance cameras, can detect emotions and identify people who are angry, nervous, or stressed in a crowd. The software then processes the emotions it perceives in the context of the age, gender, and identity (where known) of the people it is surveilling to flag potential criminals and terrorists. Last year, the firm's software was used to power the FindFace app, which works on the Russian version of Facebook to find anyone from lost family members to suspects in cold cases.
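To make the idea concrete, the kind of scoring described above can be sketched as a weighted combination of per-emotion probabilities. Everything here is an illustrative assumption: the emotion labels, weights, and threshold are invented for the example and do not reflect NTechLab's actual algorithm.

```python
# Hypothetical sketch of an emotion-based threat score.
# Labels, weights, and threshold are assumptions, not NTechLab's method.

AGGRESSION_WEIGHTS = {"anger": 1.0, "stress": 0.7, "nervousness": 0.5}

def threat_score(emotion_probs, threshold=0.6):
    """Combine per-emotion probabilities into a single score and flag.

    emotion_probs: dict mapping emotion label -> probability in [0, 1],
    e.g. the output of a face-emotion classifier.
    Returns (score, flagged), where flagged means "refer to a human".
    """
    score = sum(AGGRESSION_WEIGHTS.get(label, 0.0) * p
                for label, p in emotion_probs.items())
    return score, score >= threshold

# A face read as mostly angry crosses the threshold; a calm one does not.
angry = threat_score({"anger": 0.8, "joy": 0.2})
calm = threat_score({"joy": 0.9, "nervousness": 0.1})
```

In a real deployment the flag would only be a lead for human review, not a determination, which matches how the FBI describes its own use of such systems later in this article.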
NTechLab claims that the technology is more than 94 percent accurate. If that claim holds, municipalities that use it may be able to monitor situations in real time, stopping crime before it happens. The firm's clients are mostly retail businesses and security firms, but local, state, or even federal governments could conceivably adopt the technology.
Other Technologies Preventing Crime
Technology has already changed the way the authorities fight crime and work to prevent it. The FBI has been using the Next Generation Identification (NGI) facial recognition system, which allows the agency to parse more than 411 million photos to identify suspects, and not just the faces of people who have committed crimes: it also searches the visa and passport application photos held by the State Department. In fact, experts estimate that approximately 117 million Americans, around half of all adults in the U.S., are in the database. This kind of technology has also been deployed in airports since the 9/11 terrorist attacks.
Facial recognition is also being used to boost security in other contexts: HSBC uses facial recognition software in place of more traditional security measures, as does Lloyds in partnership with Microsoft. While this technology is primarily intended to boost online security, it is in essence also working to prevent crime and fraud.
So, could a dystopian future like the one shown in Minority Report, in which harmless people are imprisoned without ever committing actual crimes, be possible in a world that makes use of this technology? The FBI has responded to criticisms of its use of the NGI system by saying that it uses the software to generate leads, not to make positive identifications. However, state and local law enforcement agencies also have access, and might have different policies or de facto procedures. As of October 2016, Wired reported that more than 40 civil liberties groups had requested that the Civil Rights Division of the Justice Department (now headed by Jeff Sessions) evaluate the use of the technology around the country and issue guidance. As yet, the matter remains unresolved.