Facial recognition software isn’t perfect

Technology has come a long way in recent decades. Computers can do all sorts of things, from predicting where crime will occur to analyzing photos with facial recognition software to identify who appears in a surveillance video.

The upside is that this removes some human error from the process. In the past, an officer might look at a photo, compare it to a suspect, and decide the two looked like the same person. But what if the officer is biased? What if they're not properly trained? What if they're exhausted while making that call? Any number of mistakes can lead to a wrongful arrest.

However, it is dangerous to trust computers completely. There are documented cases of facial recognition software misidentifying people. We tend to think of computers as flawless machines, but they are not; they make mistakes too.

This is somewhat similar to predictive policing. Software that predicts where a crime will happen sounds flawless and unbiased. However, reports indicate that the biases of police officers can be built into the algorithm through the data it learns from, and the computer may actually amplify them. It is not nearly as reliable as it sounds.

If officers rely too heavily on technology, they open themselves up to these kinds of errors, and they may make a wrongful arrest based on the computer's output. Anyone facing serious federal charges after such an arrest needs to know about all of the legal defense options available to them.