Part of the series “Viewpoints on Resilient and Equitable Responses to the Pandemic” from the Center for Urban and Regional Studies at The University of North Carolina at Chapel Hill.

The COVID-19 pandemic is causing people around the world to question how this virus will affect the many public and private systems that we all use. We hope this collection of viewpoints will elevate the visibility of creative state and local solutions to the underlying equity and resilience challenges that COVID-19 is highlighting and exacerbating. To do this we have asked experts at UNC to discuss effective and equitable responses to the pandemic on subjects ranging from low-wage hospitality work, retooling manufacturing processes, supply chain complications, housing, transportation, the environment, and food security, among others.

Nikhil Kaza is an associate professor in city and regional planning at UNC-Chapel Hill. He works at the intersection of urbanization patterns, local energy policy and equity. He takes an interdisciplinary approach to studying how institutional innovations help or hinder cities and how these innovations might impact sectors of society differently. Dr. Kaza will discuss the effectiveness and implications of cellphone-based surveillance and data collection systems emerging in response to the current health crisis.

 

Transcript – Viewpoints on Resilient & Equitable Responses to the Pandemic. Nikhil Kaza: Surveillance

In March 2020, the New York Times reported on a smartphone application being rolled out in Hangzhou, China, in response to the COVID-19 pandemic. The Alipay Health Code, as China’s official news media has called the system, is a project by the local government with the help of e-commerce giant Alibaba. The app displays a color code — Green, Yellow or Red — that indicates the health status of the phone’s owner. It is not transparent how these statuses are generated. Furthermore, the app automatically reports the user’s location to the police, who then enforce restrictions such as quarantine or limits on access to public spaces such as transit. In other cases, drones are used to identify and chastise people who are not wearing face masks or not following lockdown protocols.

These surveillance methods are not unique to China. In Tiruvallur, India, police require people who are subject to quarantine orders to download an app called CoBuddy onto their smartphones. The app queries the phone’s GPS location and requires the user to upload selfies at random times of the day. Facial recognition software is used to verify that the user is indeed the quarantined person. During an emergency sitting, Israel’s cabinet passed a law, without parliamentary approval, that authorizes Shin Bet, its security service, to mine the massive database of phone location histories it collects on an ongoing basis to trace potential contacts. The coronavirus economic relief bill passed by the U.S. Congress also included $500 million for the Centers for Disease Control to build a “surveillance and data collection system.”

The use of surveillance has been credited with limiting the spread of the coronavirus in Singapore and South Korea. Epidemiological surveillance has a long history, ranging from the Spanish flu and tuberculosis to SARS, MERS and AIDS. Much of this surveillance, however, requires complicated recall during focused interviews and painstaking sleuthing that interprets digital and non-digital traces to identify whom an infected person was in contact with as they went about their daily activities. In this iteration of surveillance, we are abandoning such thick data collection in favor of thin, large-scale data collection based on the proximity of cell phones. We should heed the warnings of James Scott, who vividly characterized the authoritarian high-modernist tendency to create legibility from illegibility and its penchant for simplifying complexity.

This surveillance state is built in conjunction with the surveillance economy. The advent and widespread adoption of smartphones has created a platform for myriad sensors (Bluetooth, microphone, GPS, radio and accelerometer, to name a few). In combination with the phones’ data storage and communication capabilities, these devices have become continuous and ubiquitous surveillance devices that soak up enormous amounts of data exhaust from users. This data exhaust has routinely been used by firms to target advertisements, reframe information in search results and reprioritize the relevance of news. Shoshana Zuboff argues in Surveillance Capitalism that human experience, monitored and quantified through these ubiquitous sensor networks, is treated as proprietary behavioral surplus on which behavioral futures markets are built. She also characterizes a “dispossession by surveillance” that challenges the psychological and political bases of self-determination.

I will address the probity of this surveillance state later, but first I want to address the effectiveness of the data. GPS traces from cellphones are notoriously unreliable for precise and persistent tracking. Location traces emitted by smartphones are best suited to identifying average location over time rather than the precise path that contact tracing in an epidemiological event requires. For example, in a recent study, Merry and Bettinger found that horizontal positions recorded by smartphones are off by as much as 100 meters and are highly dependent on phone settings, environmental and built-form features, and time of day. Location accuracy in indoor settings is even more challenging. Recent attempts by Apple and Google to create a decentralized, automated tracing system based on Bluetooth have been criticized by security experts for their potential to reveal identities, for faulty signals, and for the possibility that trolls could render the system useless by inundating it with false positives.
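To see how quickly position error of this magnitude swamps the few-meter precision that contact tracing needs, consider the following minimal simulation. It is an illustration of my own, not a description of any deployed system; the Gaussian noise model, the 30-meter error figure and the 10-meter contact threshold are all assumptions chosen for the sketch.

```python
# Minimal, illustrative simulation: how horizontal GPS error affects
# proximity-based contact detection. All parameters are assumptions.
import math
import random

def noisy_position(x, y, sigma_m):
    """Perturb a true position with Gaussian horizontal error (meters)."""
    return x + random.gauss(0, sigma_m), y + random.gauss(0, sigma_m)

def measured_distance(true_distance_m, sigma_m):
    """Distance between noisy readings of two points true_distance_m apart."""
    ax, ay = noisy_position(0.0, 0.0, sigma_m)
    bx, by = noisy_position(true_distance_m, 0.0, sigma_m)
    return math.hypot(ax - bx, ay - by)

def apparent_contact_rate(true_distance_m, sigma_m, threshold_m=10.0, trials=10_000):
    """Fraction of trials in which the pair looks like a 'contact'."""
    hits = sum(
        measured_distance(true_distance_m, sigma_m) <= threshold_m
        for _ in range(trials)
    )
    return hits / trials

if __name__ == "__main__":
    sigma = 30.0  # assumed horizontal error; real-world error varies widely
    print("Truly 2 m apart, flagged as contacts:", apparent_contact_rate(2.0, sigma))
    print("Truly 100 m apart, flagged as contacts:", apparent_contact_rate(100.0, sigma))
```

With tens of meters of error, pairs who were genuinely close are frequently missed while distant pairs are occasionally flagged, which is exactly the mix of false negatives and false positives that makes individual-level tracing from GPS traces unreliable.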

Furthermore, the accuracy of the location data is highly dependent on the make and model of the sensor and on the technology used to infer the location. Since these are highly correlated with socio-economic structures and classes, the burden of false inference falls disproportionately on lower-income people, migrants and other marginalized groups. Any policy-making that relies on individuals’ precise location histories to trace potential exposure is, at best, relying on data collection mechanisms that are not suited to the task.

But, at worst, once a false identification is made using unreliable data, the real social implications are enormous. Virus vigilantes in the United States and United Kingdom have resorted to anti-Asian racist abuse. Some South Koreans resorted to doxxing suspected infectious persons on social media. In China, only persons with a Green code would be allowed into grocery stores, and persons with a Red code would be prevented from entering their own apartment complexes by neighbors fearful that their own phones would turn Red. In India, doctors suspected of treating COVID patients have been evicted from their rental apartments. In some instances, social ostracism and misinformation led to suicide and inflamed community tensions. Without adequate protections and careful ethical considerations, these identifications are Scarlet Letters and Yellow badges. In addition to the enormous erosion of personal civil liberties, the economic costs are substantial. Leaks of faulty predictions or location histories will shape decisions about which locations to avoid and, therefore, which businesses to patronize.

Data from these heterogeneous sensor networks is, in some instances, useful to cities and health authorities. Data aggregated over time and space that point to changes in population location (rather than individual location), large-scale movements and congregations might help operationalize and police physical-distancing orders. Cities can also use aggregated and anonymized location information to identify accessibility changes due to widespread closures and reductions in hours. Fusing these data with other sources, such as increased demand on social services like food banks and health centers, can also help redirect resources. Anonymized and aggregated data can likewise be used to manage public spaces such as parks and beaches.
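As a rough sketch of what such aggregation could look like in practice (my own illustration, not any city’s actual pipeline), individual pings can be collapsed into coarse space-time bins, with sparsely populated bins suppressed before anything is reported; the grid size, time window and suppression threshold below are assumptions for the example.

```python
# Illustrative sketch: aggregate individual location pings into coarse
# space-time bins and suppress sparsely populated bins, so that counts
# of people, not individual trajectories, are what gets reported.
from typing import Dict, Iterable, Set, Tuple

Ping = Tuple[str, float, float, int]  # (user_id, latitude, longitude, unix_timestamp)
BinKey = Tuple[int, int, int]         # (lat_cell, lon_cell, time_window)

def aggregate_pings(
    pings: Iterable[Ping],
    cell_deg: float = 0.01,   # roughly 1 km grid cells (assumption)
    window_s: int = 3600,     # hourly time windows (assumption)
    min_users: int = 10,      # suppress bins with fewer distinct users (assumption)
) -> Dict[BinKey, int]:
    """Count distinct users per space-time bin, dropping small bins."""
    users_per_bin: Dict[BinKey, Set[str]] = {}
    for user_id, lat, lon, ts in pings:
        key = (int(lat // cell_deg), int(lon // cell_deg), ts // window_s)
        users_per_bin.setdefault(key, set()).add(user_id)
    return {
        key: len(users)
        for key, users in users_per_bin.items()
        if len(users) >= min_users
    }
```

Comparing these binned counts across days would reveal the kinds of population-level shifts described above, such as emptier transit corridors or crowded parks, without retaining any individual’s path; how coarse the bins must be, and how high the suppression threshold, are exactly the governance questions raised below.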

Irrespective of the technology and techniques used to collect data, we need to have a national conversation about the purposes of data collection and the restrictions on its uses. We need to put in place sunset clauses for these deeply invasive collection practices. We need to think carefully about how these surveillance systems will be adapted to criminal proceedings and other governmental activity once this pandemic passes. We need to think proactively about governance mechanisms, ownership structures and rules. We need rigorous and impartial audits of the data to ensure that false positives and faulty sensors are not creating detrimental outcomes for particular individuals and groups. We need accountability built into the system, with responsibility taken for errors, lest we repeat the mistakes of credit reporting. In short, we need to wrap our heads around institutional governance rather than focusing on technological feasibility. Without proper protections, data governance, quality checks and data disposal mechanisms, the surveillance state and economy are bound to erode privacy for individuals and, in particular, for vulnerable groups. This erosion is not merely an esoteric concern of the rich but a material one for the poor.
