The pandemic tested the limits of facial recognition


Increasingly, it is being put to use in the name of public health. Australia recently expanded a program using facial recognition to enforce covid-19 precautions. People in quarantine are subject to random check-ins, where they have to send a selfie to confirm they are following the rules. Location data is also collected, according to Reuters.

When it comes to necessities like emergency benefits that pay for housing and food, the first priority should be making sure everyone is able to access help, Greer said. Preventing fraud is a reasonable goal, he added, but the more urgent one should be getting people the benefits they need.

“Systems must be built with human rights and the needs of vulnerable people in mind from the beginning. Those can’t be afterthoughts,” Greer said. “They can’t be bug fixes after it’s gone wrong.”

Hall at ID.me said his company’s services are preferable to older identity verification methods and have helped states cut out the “majority” of unemployment fraud since facial verification checks were implemented. He said unemployment claims have nearly a 91 percent pass rate through the automated facial verification step; the rest can verify their identity either in person or via video call with an ID.me representative.

“[That] is our whole purpose,” he said. “If we can automate 91 percent of it, then states that are constrained in terms of resources can use those resources to provide white-glove concierge service to the other 9 percent.”

If users can’t get through the facial recognition process, ID.me emails them to follow up, according to Hall.

“The whole point of the company is that it’s about helping people access the things they deserve,” he said.

Tech in the real world

The months that JB has lived without benefits have not been easy. Financial insecurity is stressful enough on its own, and other troubles, like a broken computer, added to the anxiety. Even their former employer couldn’t or wouldn’t help cut through the red tape.

“It’s especially isolating to be like, ‘Nobody’s helping me in this situation,’” JB said.

On the government side, experts say it makes sense to adopt new technology, but cases like JB’s show that the technology itself is not the whole answer. Anne L. Washington, an assistant professor of data policy at New York University, says it’s tempting to think of a new government technology as a success if it works most of the time, even if it fails 5 percent of the time in the real world. She compared the result to a game of musical chairs, where in a room of 100 people, five are always left without seats.

“The problem is that governments can get some kind of technology and make it work 95 percent of the time, and they think it’s solved,” she said. That is when human intervention becomes more important than ever, Washington said: “They need a system to always handle the five people left standing.”

There is an additional layer of risk when a private company is involved. The most common issues arise around the new classes of data the technology collects and where that data is stored, Washington said. Without a trusted entity with a legal duty to protect people’s information, sensitive data could end up in the wrong hands. How would we feel, for example, if the federal government handed over our Social Security numbers to a private company?

“The problem is that governments can get some kind of technology and make it work 95 percent of the time, and they think it’s solved”

Anne L. Washington, New York University

The widespread and unexamined use of facial recognition tools also has the potential to affect already marginalized groups more than others. Transgender people, for example, have detailed frequent problems with tools like Google Photos, which may prompt them to confirm whether photos from before and after their transition show the same person. That can mean correcting the software constantly, over and over.

“[There’s] an inaccuracy in the technology’s ability to reflect the breadth of real diversity and the edge cases that exist in the real world,” said Daly Barnett, a technologist at the Electronic Frontier Foundation. “We can’t rely on them to properly classify, account for, and represent those edge cases.”

Worse than failure

Conversations about facial recognition often debate how the technology can fail or misidentify people. But Barnett urges people to think beyond whether biometric tools are accurate or whether the technology reflects bias; that framing perpetuates the idea that we need them at all. In fact, activists like Greer warn, the tools can be even more dangerous when they are fully functional. Facial recognition has already been used to identify, punish, or restrain protesters, even when people are fighting for their rights. In Hong Kong, protesters wore masks and goggles to hide their faces from such police surveillance. In the U.S., prosecutors dropped charges against a protester who had been identified using facial recognition and accused of assaulting police officers.


