This AI tool measures social distancing. But is more surveillance worth the risk?
- Since the pandemic began, nations have been using technology to varying degrees to contain the outbreak.
- This new tool is able to place moving people on a map and estimate the distance between them.
- Some privacy advocates are raising concerns about private companies and governments installing surveillance technologies.
As COVID-19 continues to spread across the planet, some nations have been using technology to help flatten the curve.
In South Korea, for example, officials have been using GPS to track the movements of infected individuals in order to see who else might have contracted the virus. In Taiwan, the government has been enforcing quarantines through a smartphone-tracking app. And in the U.S., data scientists are exploring how they might use machine learning to predict who's most at risk of dying from COVID-19, and using those projections to better allocate resources.
Last week, a company called Landing AI introduced another way technology might help combat the pandemic: a tool that measures social distancing. The tool uses cameras and AI to track people’s movements, and it’s able to put their location on a bird’s-eye-view map of whatever area the camera is observing. Using these calculations, the tool estimates the distance between people.
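Landing AI has not published its implementation, but the approach it describes — mapping detected people from the camera view onto a ground plane, then measuring pairwise distances — can be sketched in a few lines. The sketch below assumes person detection has already happened and that a 3×3 homography matrix (obtained from a one-time camera calibration step) is available; the function names and the 2-metre threshold are illustrative, not Landing AI's.

```python
import numpy as np
from itertools import combinations

def to_ground_plane(points_px, H):
    """Project pixel coordinates onto a bird's-eye-view ground plane
    using a 3x3 homography H from a prior camera calibration."""
    pts = np.hstack([points_px, np.ones((len(points_px), 1))])
    mapped = (H @ pts.T).T
    # Divide by the homogeneous coordinate to get ground-plane (x, y).
    return mapped[:, :2] / mapped[:, 2:3]

def too_close(points_px, H, min_dist=2.0):
    """Return index pairs of detected people whose ground-plane
    distance falls below min_dist (e.g. 2 metres)."""
    ground = to_ground_plane(np.asarray(points_px, dtype=float), H)
    violations = []
    for i, j in combinations(range(len(ground)), 2):
        if np.linalg.norm(ground[i] - ground[j]) < min_dist:
            violations.append((i, j))
    return violations

# Illustration only: an identity homography, i.e. pixel coordinates
# are already in metres on the ground plane.
H = np.eye(3)
people = [(0.0, 0.0), (1.0, 0.0), (10.0, 10.0)]
print(too_close(people, H))  # → [(0, 1)]
```

In a real deployment, the flagged pairs would drive the red highlighting and alerts the company describes; detection itself would come from a separate person-detection model.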
Landing AI says businesses could use the tool to ensure employees are practicing good social distancing.
“For example, at a factory that produces protective equipment, technicians could integrate this software into their security camera systems to monitor the working environment with easy calibration steps,” the company wrote in a blog post. “As the demo shows below, the detector could highlight people whose distance is below the minimum acceptable distance in red, and draw a line between to emphasize this. The system will also be able to issue an alert to remind people to keep a safe distance if the protocol is violated.”
Landing AI isn’t the first organization to apply AI-powered cameras to social distancing. Some police departments, for instance, have been using surveillance cameras to detect large gatherings of people and then sending officers to disperse the crowds.
Practices like these might help flatten the curve, but they also bring a unique set of threats to the public.
The dangers of normalizing surveillance
Landing AI noted that its system won’t be able to identify particular individuals.
“The rise of computer vision has opened up important questions about privacy and individual rights; our current system does not recognize individuals, and we urge anyone using such a system to do so with transparency and only with informed consent.”
Still, some privacy and workers’ advocates are concerned about introducing these kinds of systems to the workplace. In its 2019 report, New York University’s AI Now Institute wrote that using AI tools like these “pools power and control in the hands of employers and harms mainly low-wage workers.” Others have raised concerns over normalizing mass surveillance, and the potential for employers to abuse these kinds of AI systems, now or in the future.
One concerned voice is Edward Snowden, the former CIA contractor who exposed NSA surveillance programs. In a recent interview with the Danish Broadcasting Corporation, Snowden spoke about the potential problems with introducing technological surveillance measures during the pandemic.
“When we see emergency measures passed, particularly today, they tend to be sticky,” Snowden said. “The emergency tends to be expanded. Then the authorities become comfortable with some new power. They start to like it.”
One key takeaway from the Snowden interview is to be wary not necessarily of how surveillance tools might be used today, but of how they might be used years from now — we might someday find that these tools have become too integrated into our society, too normalized, to easily remove.