Photo credit: BigStockPhoto/ismagilov
by Gina Griffin, DSW, MSW, LCSW
We live in a time when our relationships with technology are comprehensive and complicated, and there are very few arenas left where we won’t be asked or required to interact with technology. There are new developments every day, and it can be hard to keep up. I know that it’s hard enough to keep up with family doctor appointments, client notes, and professional development on any given day. So, I’m hoping that by serving as your guide on these dips into technology, I can help you to have a slightly better idea of what’s going on out here.
Facial Recognition
Today, we’re taking a look at facial recognition. I remember being much younger and watching movies like Minority Report, when this seemed like a thing of the future. But the future is here. Facial recognition is being used in everyday tasks like unlocking your phone. It’s also used in weightier situations, such as providing suspect matches in law enforcement and helping airport security checks proceed quickly and in an orderly fashion. On the surface, this probably seems just fine. When used appropriately, it’s hard to argue against the usefulness of the technology, and many people seem to approve. For example, in a 2023 CBS 2 NY segment on facial recognition, many customers at a Fairway grocery store in Lower Manhattan approved of using this type of technology to help keep grocery costs down by deterring shoplifting.
Unfortunately, there is usually also a downside to the use of these types of technologies. And the burden of those problems can come with a pretty high cost.
When Facial Recognition Gets It Wrong
The use of facial recognition software has already led to multiple wrongful arrests. As an example, in August of 2023, the Detroit Police Department made its third wrongful arrest based on facial recognition. The woman in question was eight months pregnant at the time, and it’s alarming to think of the negative health effects that could have resulted from her arrest and detainment. In another example, a Texas man was wrongfully arrested for armed robbery based on a facial recognition match. He was held in jail for two weeks, during which time he was sexually assaulted and beaten by other inmates. It was later proven that the accused wasn’t even in the state at the time the robbery occurred. He has been released and is suing several of the organizations involved in the fiasco.
So, it’s pretty easy to see how facial recognition technology, when it’s used in the wrong context and when it fails, can have a very negative impact both on the general public and on the at-risk populations that we work with as social workers.
Inequity and Facial Recognition
Harvard scholar Alex Najibi details several ways in which facial recognition places marginalized communities more at risk. First, facial recognition is very poor at matching people of color. It performs worst at recognizing dark-skinned African American women ages 18-30; for this population, error rates run as much as 34% higher than they do for light-skinned men.
Second, communities of color are already over-policed. This means that there are more Black people arrested for minor crimes, and there are subsequently more mugshots of African Americans coming up in the databases. As a result, this system becomes part of a feedback loop in which people in these communities are more likely to be singled out because they are watched more closely than those in non-Black neighborhoods.
Third, training data sets generally include far more white and male faces. Samples of African Americans are also often of poor quality, so the systems trained on them are, as a result, very poor at identifying African American faces.
None of this is a surprise to anyone who has been keeping up with problems related to AI, facial recognition, and other advanced technologies. But it definitely bears repeating.
Encryption, Consent, and Opting Out
Most concerning is that, at this point, facial data can’t be encrypted, and facial scans are easier to steal than other types of biometric data. So, while you can always update your passwords, you won’t be able to easily alter your face if your data is stolen.
Additionally, much of the surveillance is done without proper consent of people from whom the data is gathered. It is also often completed across multiple agencies, some of those being third-party agencies that should not rightfully have access to the data.
So, is there any hope of protecting ourselves from the improper use of facial recognition surveillance? The answer is yes, but we need to be vigilant. Groups such as The Algorithmic Justice League and ISACA (Information Systems Audit and Control Association) work to address risks and inequities in these spaces. And as social workers, we can advocate for appropriate consent and transparency in the use of facial recognition technologies, properly secured data, privacy protocols, and accountability for everyone involved in using these technologies.
Additionally, activists such as Y. K. Hong remind us that we still have the option to opt out of most types of biometric data gathering, and they state that they do so every time. Airlines and law enforcement don’t like it when you opt out, they note, but there is still generally the option to use another form of ID. Of course, they also caution that opting out can place already marginalized groups at greater risk, and they advise keeping this in mind when choosing to do so.
I hope that some of this helps you to make good choices when it comes to facial recognition. I may report more broadly on other types of biodata gathering in the near future. And I can assure you that I never voluntarily share my biodata with anyone.
Dr. Gina Griffin, DSW, MSW, LCSW, is a Licensed Clinical Social Worker. In 2012, she completed her Master of Social Work at the University of South Florida, and in 2021, she completed her DSW at the University of Southern California. She began to learn R programming for data analysis in order to develop her research-related skills. She now teaches programming and data science skills through her website (A::ISWR) and free Saturday morning #swRk workshops.