For eight years, police services have been discreetly using algorithmic detection software for video surveillance, outside any legal framework. This software from the Israeli company BriefCam is already deployed in about a hundred French cities.


The scandal has just been revealed by the investigative media outlet Disclose: for eight years now, French police services have been using recognition software for video surveillance in a completely illegal way. Dubbed Video Synopsis, the device in question comes from the Israeli company BriefCam. The AI it embeds makes it possible to analyze camera footage and flag situations that fall outside the normal. It can also track a vehicle by reading its license plate, or follow a targeted person. Better still, the software offers even more intrusive options, including facial recognition and estimation of the target's age, gender, and height. Since it is also equipped with real-time analysis capabilities, its algorithms can process hours of video surveillance very quickly in search of specific elements.

Beyond its use outside any legal framework, the second concern is that the deployment of Video Synopsis is anything but anecdotal. Tested at the Seine-et-Marne departmental directorate of public security in 2015, the tool was gradually rolled out to other departments and to several specialized police units.

The AI of the tool called Video Synopsis automatically identifies people, who can then be filtered according to criteria, for example to follow the person wearing a blue sweater. © BriefCam

Police outside the law

In total, more than a hundred French municipalities have this software, according to the Israeli company. The problem is that the use of Video Synopsis falls largely outside the scope of French and European law. This explains why the Ministry of the Interior has always been very discreet about the use of this device. Although it was trialed two years ago, Disclose found that no assessment of its impact on data security has ever been carried out. Likewise, the CNIL appears never to have been informed, even though this is an obligation. In short, when the police use this software, they are operating completely outside the law.

In France, at the regulatory level, it should be noted that facial recognition has been authorized only in very rare cases. And yet the State has sought to open the door to these AI-enhanced cameras with the so-called 2024 Olympic Games law, which was passed and then validated by the Constitutional Council last May. It authorizes, until 2025, the use of algorithmic video surveillance to detect potentially risky events; it remains to be seen what those events may be. The law was originally meant to go further, as far as allowing real-time facial recognition. But faced with an outcry, and because it would have contradicted the European Parliament's position of June 14 calling for a ban on the practice, elected officials eventually abandoned the idea.

But there is the law and there is practice, and this case of Israeli software shows that, ultimately, the damage has already been done. Disclose specifies that, according to a police source, the equipment is used "without judicial oversight or requisition". In other words, police services can use this software for any reason and without any prior authorization. Media coverage has prompted a reaction from the authorities, in particular from the CNIL, which has just announced that it is "initiating a control procedure" with the Ministry of the Interior following this revelation.
