
Facial Recognition Makes Mistakes. No Big Deal, Right?

Stream: Should Big Brother Watch Us If He Would Keep Us Safe?

China, like Britain, is installing cameras in public places to track its citizens. Britain already has at least one surveillance camera for every 11 people, a ratio that is rising. China wants in on the photographic fun. The Washington Post reports:

The intent is to connect the security cameras that already scan roads, shopping malls and transport hubs with private cameras on compounds and buildings, and integrate them into one nationwide surveillance and data-sharing platform.

It will use facial recognition and artificial intelligence to analyze and understand the mountain of incoming video evidence; to track suspects, spot suspicious behaviors and even predict crime; to coordinate the work of emergency services; and to monitor the comings and goings of the country’s 1.4 billion people, official documents and security industry reports show.

Computers Make Mistakes

“Artificial intelligence” (a.k.a. “deep learning”) always sounds scary, but it is nothing more than old-fashioned statistical modeling done on a large scale. Because it is a form of modeling, it is imperfect. That means an algorithm designed to look at a picture of Mr. Wong and say, “This is Mr. Wong,” sometimes won’t. Sometimes it will say it is you.
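
To get a feel for what “imperfect” means at this scale, here is a back-of-the-envelope sketch. The false-match rates below are illustrative assumptions, not figures from the Post’s reporting; the point is only that even a matcher that is wrong once in a million comparisons, run against 1.4 billion faces, still serves up a crowd of innocent look-alikes.

```python
# Back-of-the-envelope: how many innocent look-alikes does a face matcher
# flag when it scans a huge gallery? All rates below are assumptions.

population = 1_400_000_000  # faces the system can compare against

for false_match_rate in (1e-3, 1e-4, 1e-5, 1e-6):
    expected_false_matches = population * false_match_rate
    print(f"false-match rate {false_match_rate:.0e}: "
          f"~{expected_false_matches:,.0f} wrong candidates per query")

# Even at one error per million comparisons, a single query against the
# whole gallery is expected to return about 1,400 people who are not
# the person in the photo.
```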

What harm could there be in that?

Suppose you have been incorrectly identified as standing outside a certain building where known troublemakers have been seen. The algorithm that said you were there then looks to the “Police Cloud” database, which has “such data as criminal and medical records, travel bookings, online purchase and even social media comments.”

The computer next looks up the “metadata” from your phone records. This tells exactly where you were when you made every call, who you called and for how long, what devices you and the other party used, whether the call was followed by any data (say, a Snapchat), and so on. The only thing the computer does not admit to knowing is what you said.
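
To make concrete how much a single “metadata only” entry carries, here is a hypothetical record layout. The field names and values are invented for illustration and real carrier schemas differ, but the categories match the list above: who, whom, when, where, how long, and on what device.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical call-detail record: no audio and no message content,
# yet it still pins down who, whom, when, where, how long, and on what device.
@dataclass
class CallRecord:
    caller: str               # your number
    callee: str               # who you called
    started_at: datetime      # exactly when the call began
    duration_seconds: int     # how long you talked
    caller_cell_tower: str    # roughly where you were standing
    callee_cell_tower: str    # roughly where the other party was
    caller_device: str        # handset identifier (e.g. IMEI)
    followed_by_data: bool    # e.g. a Snapchat sent right afterward

record = CallRecord(
    caller="+86 555 0100",
    callee="+86 555 0199",
    started_at=datetime(2017, 11, 3, 21, 17),
    duration_seconds=340,
    caller_cell_tower="tower-012",
    callee_cell_tower="tower-244",
    caller_device="IMEI 35-209900-176148-1",
    followed_by_data=True,
)
```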

The algorithm now updates your “social credit” score, lowering it. Not only does it ding your score, but the people you called also take a small hit.
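
Here is a minimal sketch of how that automatic, rippling penalty might be wired up. The starting scores, penalty size, and contact discount are all invented for illustration; nothing here reflects the actual scoring rules.

```python
# Hypothetical: one flagged sighting docks your score, and a smaller
# penalty ripples out to everyone you recently called. Numbers invented.

scores = {"you": 720, "friend_a": 700, "friend_b": 680}
recent_calls = {"you": ["friend_a", "friend_b"]}

def apply_penalty(person, penalty=50, contact_factor=0.2):
    """Dock the flagged person, then dock their contacts a fraction of it."""
    scores[person] -= penalty
    for contact in recent_calls.get(person, []):
        scores[contact] -= int(penalty * contact_factor)

# A single (possibly mistaken) facial-recognition hit triggers the update,
# with no human in the loop and no appeal:
apply_penalty("you")
print(scores)   # {'you': 670, 'friend_a': 690, 'friend_b': 670}
```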

The entire process is automatic, with no way to check errors, so you’ll never know why the hiring manager rejected your application. (You won’t know at Google, either.)

We’re All Guilty

There is another possibility. The facial-recognition algorithm does not make a mistake. It really was you standing there. You may have had an innocent explanation for being at that suspicious corner. But we’re talking public safety here. Why take a chance? A suspicious corner was involved. And it’s always better to be safe than sorry, isn’t it?

Here we recall the words […]

Click here to read the rest. Clear your cookies after to maintain plausible deniability.

Categories: Culture

4 replies »

  1. “But, it’s only meta-data!”

    Remember what the slimy little intel establishment neocon, Michael Hayden, blurted out during a panel discussion on the USG’s vacuuming up all of the communications data in the USA: “We kill people based on meta-data.”

    https://www.tech.com.pk/we-kill-people-based-on-metadata-ex-nsa-chief/

    Don’t give these slime the benefit of the doubt.

    The never-ending War on Terror is targeted just as much at you as it is at Mohamed in the hills of Afghanistan.

    Homework: Define “terrorist…”

    Hint: https://www.aclu.org/other/how-usa-patriot-act-redefines-domestic-terrorism

    It’s just a matter of time.

  2. All this at a time when crimes against persons and property are at an all-time low — in the US and UK, anyway.

    Methinks there are other, more nefarious, reasons for the cameras — stating the obvious, of course.

  3. Until info to the contrary comes along, I’d wager that the authorities using facial recognition algorithms will not mindlessly trust the algorithm — if a criminal case is being prosecuted, or they’re trying to track down a suspect, chances are they’ll actually do some validation testing.

    Such as, look at the photo the software identified AND cross-reference that with other information such as the phone meta-data mentioned, credit card use (time & place, which is already routine), etc. Then, if they go so far as to apprehend the wrong person, some doppelganger, I bet they’d do a query on that person as well…and very likely confirm that person was elsewhere.

    After all, the authorities want to get the actual perpetrator and don’t want to look like buffoons for letting the actual criminal get away and commit more crimes…
