The hazards of facial recognition in schools

New York has banned the technology in its schools. Florida has long been opposed to gathering any kind of biometric data from students.

Advocates worry that using facial recognition in schools could disproportionately harm students of color. Justin Sullivan via Getty Images

New York Education Commissioner Betty Rosa last week banned schools from using facial recognition technology, the culmination of a multiyear effort by privacy advocates to block the spread of the algorithm-based tools before they become an everyday fact of life in schools, like closed-circuit TV monitoring and police officers.

Advocates worry that using facial recognition in schools could disproportionately harm students of color, subject students to monitoring without consent, and lead to school officials relying on the technology for tasks it is not well-suited for.

Rosa’s decision came after lawmakers ordered a state agency to study the potential ramifications of facial recognition technology in schools. The study identified serious concerns, including potentially higher rates of “false positives for people of color, nonbinary and transgender people, women, the elderly and children,” Rosa said in her order.

Those concerns “are not outweighed by the claimed benefits” of the technology, she added, saying there was little information available on situations when facial recognition prevented violent incidents.

But Rosa decided that schools could use other kinds of biometric information, such as fingerprints or retina scans. Schools that use the alternative methods, though, must consider their impact on civil rights, effectiveness and parental input, she said.

The Florida Legislature has long been opposed to gathering any kind of biometric data from students. As far back as 2014, lawmakers passed legislation banning public schools from collecting or using such data as part of an education data privacy measure.

That bill was sparked by Pinellas County schools' usage of palm scanners and related technology to charge for food in school cafeterias. The bill's sponsor "was worried that children's identifying information could be stolen out of computers and used for identity theft," the Associated Press reported. 

New York had put a moratorium on facial recognition

Empire State lawmakers first put a moratorium on schools’ use of facial recognition in 2020 and specified that it remain in place until the Office of Information Technology Services released its study, which it did last month.

Some school districts had hoped to use facial recognition data to prevent school shootings. But the agency’s report noted that the technology would be unlikely to do so. That’s because it would only flag people who are not supposed to be in the school, and 70% of school shooters from 1980 to 2019 were current students. The technology could also “lull administrators and staff into a false sense of security when what is really needed is face-to-face interaction with students who may be in crisis,” the technology office wrote.

Facial recognition, the New York agency concluded, “may only offer the appearance of safer schools.”

New York has been one of the states where organizers have been most successful in challenging the rollout of facial recognition technology, said Molly Kleinman, the managing director of the Science, Technology and Public Policy program at the University of Michigan.

In 2020, Kleinman was one of the authors of a study calling for the widespread ban of facial recognition technologies in schools before it became commonplace. The researchers warned the technology would exacerbate racism, normalize surveillance, change expectations of students, allow students’ data to be sold without their consent and entrench inaccurate practices.

To get an idea of what to expect with the rollout of the nascent technology, Kleinman and the other researchers looked at how previous security enhancements changed people’s behavior.

The introduction of school resource officers, stop-and-frisk policies and airport security all were designed to improve safety. Although they are supposed to be objective and neutral systems, the authors wrote, “in practice they reflect the structural and systemic biases of the societies around them. All of these practices have had racist outcomes due to the users of the systems disproportionately targeting people of color.”

“One thing that’s different from a lot of these earlier kinds of surveillance technologies is the way that the data travels so far beyond the school where it’s coming from,” Kleinman said in an interview. With closed circuit TV, the videos are usually stored on a server on campus or close to it. But facial recognition data is shared more widely and often kept indefinitely by technology companies providing the service.

Critic concerned biometric information held by companies indefinitely

“The student data in these databases is impossible to get back out again. So while they’re still children, their biometric information is held by companies they have no contact with, they have no agreement with,” she said. “And that can follow them around for the rest of their lives. That’s pretty scary stuff.”

What’s more, the systems that are designed to keep people safe often make them feel less safe, Kleinman added. Students in schools with metal detectors, for example, feel less safe than students in schools without them. When schools add police officers, parents who are not in the country legally try to avoid the school, Kleinman said. “They don’t come to meet with the teachers. They don’t come to school events. And we know that parental involvement is one of the indicators of student success.”

Research has shown that facial recognition is most accurate for white males, and less so for women or people with darker skin. That adds to the potential hazards of relying on it for school functions, Kleinman said.

For example, since the pandemic started, facial recognition technology has been used more frequently with standardized testing. A research assistant Kleinman worked with took a test to apply for graduate school.

But she couldn’t retrieve her results because a facial recognition tool designed to prevent cheating concluded her photo did not match the one on her ID. “This was a young woman of color, and it took a tremendous amount of effort” for her to rectify the situation, Kleinman said.

Without new regulations, schools might start using facial recognition for routine tasks, including taking attendance, checking out library books or paying for lunch. Many smartphones already use facial recognition to unlock the screen, and it’s feasible that schools could also use that technology to have students log in to tablets or computers at school, Kleinman said.

She urged regulators and lawmakers in other states to put a check on the technology before it becomes more widespread.

“Even though we’ve seen a dramatic increase in the use of facial recognition, there are still a lot of areas where it hasn’t gotten to yet. We can be doing things to regulate it,” she said. “We found that any potential benefits of facial recognition are dramatically outweighed by the harms. And it’s never too late to regulate it.”

A version of this story was first published on Route Fifty.