
Why the police didn’t use facial recognition

by admin

Facial recognition software was used in the search for Daniela Klette. The case shows that the gap between what the technology can do and what may legally be used is widening.

Algorithms recognized Daniela Klette’s face in recent images.

Str/EPA DPA

She hid from the police for 30 years. A journalist found her in 30 minutes: Daniela Klette, a member of the left-wing extremist terrorist group Red Army Faction (RAF), who was allegedly involved in attacks and robberies, was arrested in her apartment in Berlin-Kreuzberg at the end of February. The police celebrated the “spectacular search success”.

A few months before that, ARD journalists had worked on a podcast about the case. With the help of the Canadian investigative journalist Michael Colborne, they found Klette in a photo from a Capoeira club in Berlin. He apparently tracked her down using the facial recognition software PimEyes, reportedly in 30 minutes.

PimEyes is a type of face search engine. You can upload a photo of a person to the service’s website and receive images of the same person from publicly available sources: from galleries with party photos, websites of companies and clubs, porn sites and YouTube videos – only photos from social networks are missing. PimEyes claims to have faces of two billion people in its database.

These include many pictures from times when facial recognition did not yet work well, including Klette’s photo from the Capoeira club, which dates from 2017.

Today the algorithms are doing amazing things: NZZ research shows that they can find people from very blurry images, even if they are wearing sunglasses or a breathing mask, or have new glasses or a different haircut. In short: the machine now recognizes faces better than most humans do.


Why the police didn’t use facial recognition

Technically the software is impressive, but it is probably not legal. No court has yet ruled on whether PimEyes broke a law when creating its software. However, the people whose photos are stored and processed in the software were never asked for their consent – they ended up in a facial recognition database without their knowledge.

For Martin Steiger, lawyer and media spokesman for the Digital Society Association, it is clear: the very act of collecting data for the PimEyes software violates data protection law. This is because biometric data such as the face is particularly well protected. He says: “The police lack the legal basis to use such an instrument.”

Florent Thouvenin, a law professor at the University of Zurich, says that explicit consent for the use of personal data is only necessary in certain cases. “But tools like PimEyes are highly problematic. They enable new forms of surveillance.”

No Chinese-style surveillance apparatus

The fact that this creates a situation in which journalists have tools at their disposal that are forbidden to the police is only absurd at first glance, says Thouvenin. On closer inspection, it makes sense: “It’s about setting limits for the state when it comes to processing personal data.” Government actions, such as police operations, are therefore subject to stricter data protection requirements than the actions of private individuals.

This is partly because authorities have more power over private individuals, and hold more data about them, than most private organizations: tax records, fingerprints and iris scans for passports, for example. The police are prohibited from freely accessing such data because they could otherwise build a surveillance apparatus, as has already happened in countries like China.

But what is forbidden to the police is also controversial for journalists. Steiger describes the use of PimEyes from Germany as “delicate”. This could be a reason why the German podcasters emphasize that it was not they themselves who found Klette’s face, but rather a researcher from Canada.


PimEyes probably violates the rights of billions of people

The fact that PimEyes is an online service whose creation and use are probably illegal raises questions. It shows that technology companies are creating facts on the ground that today’s legal system cannot keep up with.

The service probably violates the rights of billions of people, but because no single injured party is affected severely enough, no one sues. According to the portal “Netzpolitik”, the Baden-Württemberg state data protection officer is currently conducting proceedings against PimEyes, but this has apparently not yet had any concrete effect on the company’s activities.

PimEyes has moved its headquarters from Europe to the Seychelles. This makes it more difficult for European authorities to prosecute the company for data protection violations. Nevertheless, Thouvenin believes: “A ban on tools like PimEyes would be a clear signal, and there would probably be a consensus for it.”

In the US, authorities are buying facial recognition software

At the same time, the question arises as to how society should deal with the new technical possibilities. Authorities repeatedly use means for which there is no legal basis. Sometimes a law is passed afterwards that legalizes these means.

According to a leaked customer list, Swiss authorities have already tested Clearview’s facial recognition. This startup scraped images from Facebook, Instagram, LinkedIn and other sites and turned them into a database for identity searches. Unlike PimEyes, Clearview was never simply available for anyone to use online: the company sold its software specifically to American police authorities. In Switzerland, however, the software was never officially used.


The police in several cantons, including St. Gallen, have instead used a different program to identify faces. It does not draw on photos from the Internet, but searches “only” in police databases and recordings. Even in this case, however, it is disputed whether a legal basis exists.

Basically, all new search methods are about proportionality, says Steiger: “Does the potential success of the search justify the use of means that conflict with fundamental and human rights?” He himself is of the opinion that the authorities do not need new powers to investigate well.

Where one draws the line between permissible and impermissible means is ultimately a social and political decision.
