Rite Aid's AI facial recognition falsely flagged people of color as shoplifters

Pharmacy chain Rite Aid used facial recognition technology to falsely and disproportionately identify people of color and women as likely shoplifters, the Federal Trade Commission said Tuesday, describing a system that embarrassed customers and raised new concerns about the biases embedded in such technologies.

Under the terms of a settlement, Rite Aid will be prohibited from using facial recognition technology in its stores for surveillance purposes for five years, the FTC said. The action appeared to signal how seriously the agency, which enforces federal consumer protection laws, intends to respond to concerns about facial recognition technology.

The FTC's 54-page complaint also sheds light on how a once theoretical concern — that human bias would seep into artificial intelligence algorithms and amplify discrimination — is now playing out in the real world.

Samuel Levine, director of the FTC's Bureau of Consumer Protection, said in a statement: “Rite Aid's reckless use of facial surveillance systems has caused humiliation and other harm to its customers.”

From October 2012 to July 2020, the complaint says, Rite Aid employees, acting on false alerts from the systems, followed customers around stores, searched them, ordered some to leave and, at times, called the police to confront or remove them in front of friends and family.

Rite Aid's actions disproportionately affected people of color, particularly Black, Asian and Latino customers, all under the guise of keeping “persons of interest” out of hundreds of Rite Aid stores in cities like New York, Philadelphia and Sacramento, the FTC said.

Rite Aid said in a statement that while it disagreed with the FTC's allegations, it was “pleased to reach a settlement.”

“The allegations relate to a facial recognition technology pilot program that the company implemented in a limited number of stores,” the company said. “Rite Aid stopped using the technology in this small group of stores more than three years ago, before the FTC’s investigation into the company’s use of the technology began.”

The settlement with Rite Aid comes about two months after the company filed for bankruptcy protection and announced plans to close 154 stores in more than 10 states.

Rite Aid turned to facial recognition technology as retail chains were raising alarms about shoplifting, particularly “organized retail crime,” in which groups of people steal products from multiple stores to resell on the black market.

Those concerns led several chains, including Rite Aid, to protect merchandise by locking much of it in plastic cases.

But those worries appear to have been overblown. This month, the National Retail Federation retracted its estimate that organized retail crime was responsible for nearly half of the $94.5 billion in store goods lost in 2021; experts believe the figure is probably closer to 5 percent.

Rite Aid did not tell customers that it was using the technology in its stores, and employees were “discouraged from disclosing such information,” the FTC said.

It's not clear how many other retailers are using facial recognition technology for surveillance. Macy's told Business Insider that it is using it in some stores. Home Depot says on its website that it collects “biometric information including facial recognition.”

Alvaro M. Bedoya, an FTC commissioner, said in a statement that “the clear fact that surveillance can hurt people” should not be lost in conversations about how surveillance violates rights and invades privacy.

“It has been clear for years that facial recognition systems may work less effectively on people with darker skin and women,” Bedoya said.

Woodrow Hartzog, a law professor at Boston University who has researched facial recognition technologies and the FTC, said the agency's complaint against Rite Aid shows it views AI surveillance technology as a serious threat.

The target of the agency's complaint is significant, Professor Hartzog said. Although Rite Aid hired two unnamed companies to help build a database of people it suspected of shoplifting, the FTC took action only against Rite Aid.

The FTC, he said, is essentially saying that “the culpable conduct we are targeting is a failure to exercise due diligence when working with other providers.”

The complaint says Rite Aid deployed the surveillance systems in urban areas and along public transportation routes, resulting in a disproportionate impact on people of color.

About 80 percent of Rite Aid stores are located in areas where whites are the largest racial or ethnic group. However, according to the FTC, 60 percent of the Rite Aid stores that used facial recognition technology were in areas where whites were not the largest racial or ethnic group.

Rite Aid trained security guards at its stores to enter images into a “registration database” of people it considered “persons of interest,” and employees were told to “push for as many registrations as possible.” The databases were filled with low-quality images, many of which came from surveillance cameras, cell phone cameras and media reports, the FTC said.

Officials said this flawed system caused thousands of “false positive matches,” or alerts that incorrectly indicated that a customer matched a person in Rite Aid's database. Worse, Rite Aid had no way to track false positives, the complaint says.
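The complaint does not describe the matching math, but the mechanism behind a “false positive match” is straightforward to sketch. In a typical face recognition pipeline, each image is reduced to an embedding vector, and two images are declared a match when their similarity exceeds a threshold; grainy enrollment photos blur the gap between same-person and different-person scores, so any threshold loose enough to catch real matches also sweeps in innocent shoppers. The toy simulation below illustrates that tradeoff under assumed parameters (cosine similarity, 128-dimensional embeddings, made-up noise levels); it is not a reconstruction of Rite Aid's actual system.

import numpy as np

rng = np.random.default_rng(0)
DIM = 128  # embedding size; purely illustrative

def unit(v):
    return v / np.linalg.norm(v)

# Model identities as a shared "average face" plus an individual part,
# so that different people already look somewhat alike to the system.
MEAN_FACE = unit(rng.normal(size=DIM))

def new_identity():
    return unit(MEAN_FACE + 0.5 * unit(rng.normal(size=DIM)))

def photo(identity, noise):
    # Toy stand-in for a face-embedding model: the identity vector plus
    # image noise. Larger `noise` models a blurry surveillance still.
    return unit(identity + noise * unit(rng.normal(size=DIM)))

def score_distributions(noise, trials=2000):
    genuine, impostor = [], []
    for _ in range(trials):
        a, b = new_identity(), new_identity()
        genuine.append(photo(a, noise) @ photo(a, noise))   # same person, two photos
        impostor.append(photo(a, noise) @ photo(b, noise))  # two different people
    return np.array(genuine), np.array(impostor)

for noise in (0.1, 0.9):  # sharp enrollment photos vs. grainy CCTV stills
    gen, imp = score_distributions(noise)
    threshold = np.quantile(gen, 0.05)  # loosest cutoff that still catches 95% of true matches
    fpr = (imp >= threshold).mean()     # innocent comparisons that cross it anyway
    print(f"noise={noise:.1f}  threshold={threshold:.2f}  false-positive rate={fpr:.1%}")

In this sketch, sharp photos keep the false-positive rate near zero, while the lenient threshold that noisy images force lets a large share of innocent comparisons clear the bar; that is the failure mode, compounded by the lack of any false-positive tracking, that the FTC describes.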

“Rite Aid’s failure to adequately train or supervise employees operating facial recognition technology further increased the likelihood of harm to consumers,” the FTC said.

In one case, Rite Aid employees stopped and searched an 11-year-old girl who the system had incorrectly flagged as a person likely to shoplift.

In another example cited in the complaint, a Black man wrote to Rite Aid after falling victim to a false positive facial recognition match.

“When I walk into a store now, it's weird,” he said, adding, “Every black person is not a thief, and they shouldn't be made to feel like one.”