Exploring the Ethical Complexities of Facial Recognition at Pizza & Prof
By Kimberly Manyanga; Photos by Myles Braxton
On Thursday, April 11, 2024 at 4:00 p.m., Commonwealth Honors College hosted a captivating Pizza & Prof event featuring Professor Erik Learned-Miller, the chair of the faculty in the College of Information and Computer Sciences. Held in the Honors Hub, the event drew a diverse group of approximately 35 students eager to delve into the complex landscape of facial recognition technology.
Ann Marie Russell, the Honors College associate dean of student recruitment, inclusion, and success, warmly introduced Professor Learned-Miller, highlighting his impressive academic background, which includes a PhD and a master of science in electrical engineering and computer science from MIT, as well as a BA in psychology from Yale University.
"Pizza and Prof is a special opportunity for undergraduate students in particular to learn about a professor's research in a more intimate setting, network, and potentially think about a thesis at some point," Russell emphasized to the engaged audience.
Learned-Miller enthralled the room with his unique journey from studying theoretical problems to tackling the high-stakes applications of facial recognition. He underscored the critical importance of context and scale in evaluating the risks and benefits of this technology, shedding light on how factors like image quality, lighting conditions, and database size can significantly impact the accuracy of these systems.
One of the key nuances Learned-Miller highlighted was the concept of "gallery size," or the number of images a facial recognition system is compared against. He also raised pressing privacy concerns, noting that many individuals are unaware that their driver's license photos are being used in law enforcement databases without their explicit consent.
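The gallery-size effect can be illustrated with a simple back-of-the-envelope calculation (a simplified sketch; the per-comparison false match rate below is an illustrative placeholder, not a figure from the talk or from any real system): even a very accurate matcher, applied once per gallery entry, becomes likely to produce at least one false match as the gallery grows into the millions.

```python
# Illustrative sketch of how gallery size drives false matches.
# The false match rate (fmr) is a made-up example value, and the
# independence assumption is a simplification of real systems.

def p_false_match(gallery_size: int, fmr: float = 1e-5) -> float:
    """Probability of at least one false match when an unrelated probe
    image is compared against every entry in a gallery, assuming each
    comparison is independent with per-comparison false match rate fmr."""
    return 1 - (1 - fmr) ** gallery_size

if __name__ == "__main__":
    for n in [1_000, 100_000, 10_000_000]:
        print(f"gallery of {n:>10,}: "
              f"{p_false_match(n):.1%} chance of at least one false match")
```

Under these toy assumptions, a gallery of a thousand faces yields about a one percent chance of a spurious hit, while a statewide driver's license database of millions makes a false match all but certain, which is why Learned-Miller treats gallery size as central to assessing risk.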
Challenges of Facial Recognition Technology
Throughout his talk, Learned-Miller wove in powerful anecdotes and thought-provoking questions, challenging students to grapple with the ethical implications of facial recognition. But perhaps the most striking moment of the event came when Learned-Miller shared the cautionary tale of Robert Williams, a Detroit man who was wrongfully arrested based on a flawed facial recognition match. Despite clear instructions not to rely solely on the algorithm's output, police ignored proper protocols and pulled Williams away from his family, causing immense personal and professional harm. This sobering anecdote underscored Learned-Miller's key argument: that facial recognition, like any technology, is inherently imperfect and must be managed with clear rules, rigorous oversight, and meaningful consequences for misuse.
Drawing parallels to his experience working with FDA-regulated medical devices, Professor Learned-Miller emphasized the importance of implementing safeguards commensurate with the assessed risk of each product. During the lively Q&A session, students raised insightful questions about the disproportionate impact of facial recognition errors on communities of color, the role of historical biases in photography, and the need for greater public education around these systems' limitations.
Professor Learned-Miller stressed the urgent need for legislation that puts clear guardrails around the use of facial recognition, especially in high-stakes scenarios like law enforcement. He highlighted the progress being made in states like Massachusetts, where proposed rules would require warrants, prohibit real-time surveillance, and mandate disclosure of the technology's use in arrests.
Reflecting on the Talk
Ashwini Ramesh Kumar, a second-year master's student in computer science, found the talk particularly engaging. "The talk was very relatable even if people aren't a computer science major," she noted. "There was nothing explicitly too technical, but it was still engaging. People today should know about technology, even if they're not directly working in tech."
Ramesh Kumar also emphasized the value of these interdisciplinary discussions for computer science students. "Even as computer science majors, it's important for us to get these talks outside of technical things so we can connect back to why we do what we do," she said.
As the event drew to a close, the energy in the room was electric. Students lingered long after the last slice of pizza had disappeared, continuing to engage in passionate discussions about the future of facial recognition and the role they could play in shaping it.
For many attendees, Learned-Miller's talk was a powerful reminder of the importance of interdisciplinary dialogue and critical thinking in the face of complex technological challenges. By bringing together students from diverse backgrounds and encouraging them to grapple with the ethical dimensions of facial recognition, the Honors College had created a space for meaningful growth and reflection.
For those eager to dive deeper, Professor Learned-Miller pointed to a comprehensive white paper he co-authored, titled "Facial Recognition Technologies: A Primer," which outlines a proposed regulatory framework modeled after the FDA's risk-based approach. While acknowledging that there are no easy answers, his message was one of informed optimism: that with the right guardrails in place, facial recognition could be harnessed for good while mitigating its most dangerous pitfalls.