Do ‘Contact Tracing Apps’ need a Privacy Test?
We are asking readers to contribute to this post – please comment inline or send directly to me at email@example.com.
The Coronavirus continues to cause serious damage to humanity: loss of life, employment, and economic opportunity. In an effort to restart economic activity, governments at every level (local, regional, and national) have been working on phased approaches to re-opening. However, re-opening comes with a substantial risk of new outbreaks (see a map of outbreaks across the world). Epidemiological studies show that shutdowns have been effective in preventing contagion, and recent reports from the United States indicate that some areas are reversing course and returning to shutdown.
Why Contact Tracing?
One of the main strategies to support safer re-opening is the use of contact tracing apps. The World Health Organization (WHO) defines contact tracing as follows:
Contact tracing is the process of identifying, assessing, and managing people who have been exposed to a disease to prevent onward transmission. When systematically applied, contact tracing will break the chains of transmission of COVID-19 and is an essential public health tool for controlling the virus.
The Privacy Issue?
It is rather simple:
A data warehouse of sensitive personal information from multiple sources, with wide access, is a recipe for privacy failure.
A contact tracing data warehouse contains a uniquely sensitive combination of data types: location and movement, relationships between people, and medical information. This combination doesn’t exist in any other national database, which makes it a prime target for hackers, aggressive advertisers, and well-intentioned but careless users.
Examples of Failures:
Two countries with advanced technologies, namely Norway and the UK, have pulled their contact tracing applications due to privacy concerns.
Norway’s health authorities said on Monday that they had suspended an app designed to help trace the spread of the new coronavirus after the national data protection agency said it was too invasive of privacy.
Launched in April, the smartphone app Smittestopp (“Infection stop”) was set up to collect location data to help authorities trace the spread of COVID-19, and inform users if they had been exposed to someone carrying the novel coronavirus.
The United Kingdom
A smartphone app to track the spread of Covid-19 may never be released, ministers admitted yesterday, as they abandoned a three-month attempt to create their own version of the technology.
The indication that the app “may never be released” suggests that the design was fundamentally incompatible with privacy. Some articles discussing the cancellation of the UK contact tracing app: The Times, Wired.
In Canada, Alberta’s COVID-19 contact-tracing app is a ‘security risk’ on Apple devices, according to the provincial privacy commissioner. The report can be found here.
Why are we failing – are designers reckless?
Designers of contact tracing applications are prioritizing speedy development and data sharing over privacy. No doubt, if we need contact tracing, we need it now, and the ability to share data quickly is paramount. So how can contact tracing be reconciled with privacy?
Do we need a privacy test?
I believe that the privacy issue goes beyond testing. We need a privacy framework/charter at the national level to ensure that any contact tracing application follows a set of rules.
Are there solutions?
Absolutely. Solutions start with implementing Privacy in Design; privacy must be considered early in the application design process. Data minimization, data distribution, and anonymization are a few of the tools that can be very effective at managing privacy in a public health situation.
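To make data minimization and anonymization concrete, here is a minimal, hypothetical sketch of a decentralized exchange in the spirit of designs such as DP-3T and the Apple/Google Exposure Notification framework (not any specific app's implementation): phones broadcast short-lived random tokens, remember only hashes of tokens they hear, and match exposures on the device, so no central warehouse of identities, locations, or contacts ever exists. The `Phone` class and all names here are illustrative assumptions, not a real protocol.

```python
import hashlib
import secrets


def new_ephemeral_id() -> str:
    """A rotating random identifier with no link to the user's
    name, number, or location (data minimization)."""
    return secrets.token_hex(16)


class Phone:
    """Toy model of a decentralized contact-tracing client: each phone
    stores only anonymous tokens it has heard nearby, never who or where."""

    def __init__(self):
        self.my_ids = []    # tokens this phone has broadcast
        self.heard = set()  # hashes of tokens heard from other phones

    def broadcast(self) -> str:
        eid = new_ephemeral_id()
        self.my_ids.append(eid)
        return eid

    def hear(self, eid: str):
        # Store only a hash, so even the local log is pseudonymous.
        self.heard.add(hashlib.sha256(eid.encode()).hexdigest())

    def check_exposure(self, published_ids) -> bool:
        # Matching happens on-device; a server would only relay the
        # random tokens of diagnosed users, never the contact graph.
        return any(
            hashlib.sha256(e.encode()).hexdigest() in self.heard
            for e in published_ids
        )


alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())  # Alice and Bob were near each other
# Alice tests positive and publishes only her own random tokens:
print(bob.check_exposure(alice.my_ids))      # True: Bob is notified
print(Phone().check_exposure(alice.my_ids))  # False: a stranger is not
```

The design choice to keep matching on the device is exactly what avoids the "data warehouse" failure mode described above: no single database ever holds locations, relationships, and diagnoses together.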
My book, Privacy in Design, is available free on Kindle for Prime subscribers.
Follow @drwhassan for more information on privacy, social media analytics, and the ethics of AI computing.