Ontario’s plans for contact tracing wearable devices threaten freedom, privacy
By Joe Masoodi/The Conversation
In February, the Ontario government announced it had invested $2.5 million in wearable contact tracing technology to help curb the spread of coronavirus. The funds will be directed to Facedrive Inc., a Toronto-based company, to accelerate the production of its contact tracing wristbands worn by essential workers.
These wristbands are being tested and considered for wide use in long-term care homes, a First Nation community, airlines, schools and construction sites. They work by communicating with other devices through a combination of Bluetooth and Wi-Fi, sending an alert to any employee who has been in close contact with somebody who has tested positive for the virus. The wristbands also enforce social distancing by vibrating or beeping whenever they are within two metres of each other.
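The proximity logic described above can be illustrated with a short sketch. This is not Facedrive's actual firmware, whose details are not public; it is a generic illustration of how Bluetooth devices commonly estimate distance from received signal strength (RSSI) using the log-distance path-loss model, with the constants below chosen as plausible assumptions.

```python
import math

# Assumed parameters; real devices calibrate these per hardware model.
TX_POWER_DBM = -59        # assumed RSSI (dBm) measured at 1 m reference distance
PATH_LOSS_EXPONENT = 2.0  # free-space propagation assumption
ALERT_DISTANCE_M = 2.0    # the two-metre threshold described in the article

def estimate_distance(rssi_dbm: float) -> float:
    """Estimate distance (metres) from an RSSI reading
    using the log-distance path-loss model."""
    return 10 ** ((TX_POWER_DBM - rssi_dbm) / (10 * PATH_LOSS_EXPONENT))

def should_alert(rssi_dbm: float) -> bool:
    """Vibrate or beep when another wristband appears within two metres."""
    return estimate_distance(rssi_dbm) < ALERT_DISTANCE_M

print(should_alert(-55))  # strong signal, estimated ~0.6 m -> True
print(should_alert(-80))  # weak signal, estimated ~11 m -> False
```

In practice RSSI is a noisy proxy for distance (walls, bodies and device orientation all distort it), which is one reason the effectiveness of such devices is hard to establish.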
Contact tracing technologies
During the early stages of the pandemic, governments around the world repurposed new and pre-existing technologies in efforts to track, monitor and contain the spread of the virus. Many privacy experts and surveillance scholars feared the expansion of government and corporate surveillance, pointing out the long-term implications for privacy rights, civil liberties and democracy. Contact tracing apps were one of these innovations.
Eager for life to return to normal, a little over half of Canadians (56 per cent) said they were willing to use a contact tracing app. Fast forward to today: it remains unclear whether COVID Alert, Canada’s contact tracing app, has been effective in curbing the spread of the coronavirus. The effectiveness of contact tracing wearable devices likewise remains up in the air.
The allure of technologies often overshadows the immediate and long-term social, political and ethical implications that Canadians and policymakers need to be aware of.
As someone who has researched surveillance and its histories, I find the current government proposal unsettling because of the places and people it targets.
Data shows that workplaces deemed essential services are characterized by highly racialized labour forces; the introduction of surveillance technologies like a wearable bracelet will disproportionately target vulnerable groups who have historically been subjected to disparate forms of surveillance and discrimination. The policy may not only fail to slow the spread of the virus but may actually perpetuate historical legacies of discrimination against vulnerable populations including racialized and low-income groups.
There is also potential for discrimination and bias as a result of the visibility of the devices. More than just an item around the wrist, the device broadcasts a personal decision: to opt-in or opt-out of contact tracing. At the same time, it may act as an identifying symbol for a particular labour force.
Before the COVID Alert app was launched in Ontario in July 2020, federal and provincial governments worked together in consultation with privacy bodies, while independent researchers and experts also offered recommendations. I, and others on our team, made recommendations to enhance COVID Alert’s privacy and security standards, many of which were implemented. Many lauded the app for its privacy-preserving and data security protections. Yet Ontario’s recent decision to invest in wearable contact tracing devices did not receive the same extensive level of public consultation, raising concerns over transparency, privacy and data security.
In a privacy white paper co-authored with the law firm McCarthy Tétrault LLP — available only upon request, making it a challenge for researchers to review — Facedrive claims to have followed Canadian privacy guidelines. But Canadian privacy laws have been widely criticized as outdated, and they are not an adequate benchmark for judging a technology’s threat to privacy.
Current privacy laws have been criticized for not clearly defining concepts like “personal information,” allowing tech companies to claim compliance while exploiting ambiguities in the law. Facedrive claims that their wearables “do not contain any personal information” since employee names are mapped with wearable serial numbers and stored on a centralized Microsoft Azure server. Yet the name of the employee whose test is positive and the names of those they interacted with are still available and revealed to employers on an online dashboard.
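The weakness of this "no personal information" claim can be shown in a few lines. The sketch below uses entirely hypothetical serial numbers and names; it simply illustrates why pseudonymized identifiers are not anonymous when a server-side mapping table exists, as the article describes.

```python
# Hypothetical data for illustration only.
# The wristbands broadcast only serial numbers, so a contact event
# appears to contain "no personal information":
contact_event = {"serial": "WB-1042", "contact_with": "WB-2077"}

# But the centralized server stores the name-to-serial mapping
# described in the article:
serial_to_employee = {"WB-1042": "Employee A", "WB-2077": "Employee B"}

# Anyone with dashboard access can re-identify both parties in one lookup.
who = serial_to_employee[contact_event["serial"]]
contact = serial_to_employee[contact_event["contact_with"]]
print(f"{who} was in contact with {contact}")
```

Because re-identification is a single dictionary lookup, the privacy properties of the system rest entirely on who can access the mapping, not on the absence of names from the devices themselves.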
Further complicating matters, once an employer is logged in to the dashboard, not only are they able to see who has been in contact with whom, they’re also able to assess individual employee risk levels for virus exposure and manually send notifications if they suspect transmission. This means that employers are provided with what is essentially health data, while also taking up the public health role of contact tracer. This raises further questions about employee privacy rights, data security and the ethics of workplace surveillance.
Threats to democracy
Without critical examination and debate, such surveillance practices will have serious implications for civil liberties, especially the rights to privacy, freedom from discrimination and autonomy. The deployment of contact tracing wearables in workplaces will normalize surveillance and lead to its expansion — Facedrive has already indicated its interest in continuing the use of its technologies beyond the pandemic.
Wearable contact tracing devices and the data they collect can threaten our rights, freedoms and even democracy itself. It is vital for the general public, employers and policy-makers to think carefully about the limitations and implications of wearable tracking devices, including the ways in which they collect, use and store data, and about how such surveillance operations will be dismantled after the pandemic rather than swept along by technological fanfare.
Joe Masoodi is a policy analyst at the Ryerson Leadership Lab and the Cybersecure Policy Exchange at Ryerson University in Toronto. He receives funding from the Social Sciences and Humanities Research Council and the Government of Canada’s Future Skills program. His research has been funded by the Government of Canada, the Department of National Defence, the City of Toronto, the Office of the Privacy Commissioner of Canada, and RBC.