On Apple/Google Contact Tracing

This post is an editorial from Craig Danuloff, CEO of ThePrivacyCo., maker of Priiv, a privacy management app, and publisher of this site.

Shocking myself and perhaps others, I am supportive of the broad rollout and adoption of the Apple/Google Contact Tracing technology. In fact, I think participation should be mandatory.

Some recent tweets about the Apple/Google announcement.

Those of us who care about privacy have learned to be skeptical. Over the last 20 years, nearly every technology, data collection practice, and service that could be misused to abuse privacy has been. We’re rarely given an honest accounting of what tech can do, how data is collected and used, and where it’s transferred after it’s collected.

Our first reaction to ANYTHING that collects data these days is ‘How are they going to abuse this?’ When location or any other personal data is involved, the smart default stance is to stay away, opt out, and jump in only after the experts have confirmed that it’s safe.

So when Apple & Google say they’re going to release tools to allow ‘Contact Tracing’ for the entire population, our privacy muscles twitch hard. I have heard and read many hearty ‘hell no’ reactions from colleagues, friends, and those I respect from afar.

But the skeptical approach can be taken too far. And often it is. Those who champion privacy rightly warn others about large and harmful threats, but they also frequently amplify tiny theoretical risks that don’t apply to very many people (and would do limited damage even when they occurred). This can cause people who could benefit from them to avoid useful tools, handy features, or services that pose no real material risk. 

An informed and rational view of risk is preferred, and circumstances must be considered too. These, of course, are not normal circumstances. Today we need to reconsider the calculus of supporting or participating in what would normally be indefensibly deep, personal, invasive tracking.

Here’s how I look at it:

  1. There is too much at stake. We’re facing a highly infectious killer virus, which transmits while people are asymptomatic, without direct or intimate contact. It’s killed thousands, touched millions, and locked away most of the world’s population for 3+ weeks and counting, which is devastating the world’s economy.
  2. Solid privacy is baked in. Apple and Google know a lot about privacy and how to implement it when they see fit. Their tech includes no location data, keeps everything on your phone unless you’re part of a reported infection, and protects personal anonymity via a random ID issued to you (a new one every day). We’re lucky this is happening after privacy has become such a broad public interest, or we wouldn’t even be hearing the word discussed in relation to this new tool. But today, both Apple and Google have the sensitivity and the technology to execute solid privacy, and the public is rightly demanding it.
  3. There is no viable plan ‘B’. This Contact Tracing component only works if the other parts of the solution are put into place: we need mass testing for both infection and antibodies, medication for both treatment and prophylaxis, and ultimately a vaccine (none of which currently exist). But even with those in place, without effective wide-scale contact tracing the curves will bubble up repeatedly and the medical and economic damage will expand again.

So, considering everything: people have to stay safe and society has to function again; the Apple/Google proposal exposes us to as few privacy risks as possible; and there are no obvious alternatives.

We can’t stay at home for 12-18 months until everyone is vaccinated nor can we ‘re-open’ and have infection rates spike again if that sends everyone home for long periods. Everyone has a phone, those phones go where we go, and everyone should be in this network.

The privacy features of the Apple/Google Contact Tracing Technology.


Even given the above, my support comes with four conditions:

  1. Apple and Google, and any other institutions involved, must remain committed to privacy, continue to innovate every privacy protection possible, and submit to audits to ensure compliance.
  2. Data distribution must remain limited to those involved and approved medical research uses only. No governments or commercial enterprises may have access to or use the data – aside from rigorously selected and approved medical centers.
  3. Self-reporting a positive COVID-19 result is a recipe for disaster. Only approved medical centers should have the power to report a positive test.
  4. The platform must be disassembled when this is over (as Apple/Google have now promised), likely when a vaccine has been implemented at scale. This is a temporary and one-time-only situation. 

This last item is of most concern. History doesn’t have many examples of personal data genies climbing back into their bottles. Hopefully a ‘Privacy by Design’ architecture limits the inherent risks, but it’s unlikely that every risk has been considered. In normal times we would agree the uncertainty isn’t worth the risk. But these are not normal times.

Surveillance Objections

The chief argument against this technology that I’ve heard is that it represents a massive and intractable advance in surveillance. But does it? Apple, Google, and your network provider already know every device’s exact location at all times. If they wanted to figure out who you’ve been near, they could already do it, with scary ramifications (as anyone who has ever had an uncanny Facebook friend suggestion knows).

And the entire ad-tech stack already tracks and analyzes us and our relationships in endless non-consensual ways, often with breathtakingly unfortunate accuracy and implications, using Bluetooth, location, and every other method they can cobble together. They did this before Contact Tracing and have operated largely in the shadows for years. The fact that this new use-case shines light on the dangers of tracking shouldn’t discourage us from using it for good in the manner proposed.

What is new, and worth considering, is the explicit Bluetooth-enabled list of the people you’ve been near. But that data remains on your phone unless you test positive and report (via the app) that you have been infected. In that case, the anonymous identifiers your phone has been broadcasting (which change every day) are published, and every other phone checks locally whether it has recorded any of them. This is how the system succeeds: informing people that they may have been near someone who soon thereafter tested positive, so they can decide to get tested or self-quarantine. No parties are ever told who they were near, when, or where. There is no great central database of relationships. There is only the minimally required, and quite well concealed, data necessary to allow the system to work, and even that only exists temporarily. Extreme care has been taken to preserve and protect privacy.
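To make the flow concrete, here is a minimal Python sketch of the matching logic. This is a toy model, not the actual Apple/Google protocol (the real design derives rotating identifiers cryptographically from daily keys and publishes those keys on a positive report); all class and function names here are hypothetical:

```python
import os

def daily_identifier() -> bytes:
    # A fresh random value each day; it carries no identity or location.
    return os.urandom(16)

class Phone:
    """Toy model of one handset in the contact-tracing network."""

    def __init__(self):
        self.my_ids = []        # identifiers this phone has broadcast
        self.heard_ids = set()  # identifiers overheard nearby (stays on-device)

    def new_day(self):
        # Rotate to a new anonymous identifier.
        self.my_ids.append(daily_identifier())

    def hear(self, other_id: bytes):
        # Record a nearby phone's broadcast, locally only.
        self.heard_ids.add(other_id)

    def check_exposure(self, published_ids) -> bool:
        # After a confirmed positive, the reporter's identifiers are
        # published; every phone checks for overlap locally.
        return any(pid in self.heard_ids for pid in published_ids)
```

Note that the matching happens entirely on each device: the published list reveals nothing about who heard the identifiers, or where, which is what keeps a central database of relationships from ever existing.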

But the technology has prompted several concerns in the privacy community:

  1. Bluetooth is not accurate. The fear here is that it will over-report proximity connections because in crowded cities there are people in nearby rooms or vehicles. There have also been mentions of ‘bluetooth spoofing’, which involves people capturing and re-broadcasting someone’s signal in a place where they aren’t physically present. 
  2. Locally-stored contact lists won’t stay local. There are worries they’ll be surreptitiously or accidentally uploaded eventually, providing Apple/Google, or worse someone else, with the master database. There are similar fears of re-identification, or of indirect ways to determine location, even though location is expressly not included in the data set.
  3. The core tech will never leave the stack. This is the crux of the slippery-slope argument; when any surveillance tech is added it never comes out. And with the potential for governments to gain access or apply this kind of tech, the fear grows.

Objections Overcome

I don’t believe, however, that close inspection of any of these objections justifies delaying widespread rollout and adoption of this tech. Bluetooth precision is far from perfect, and there will be some false positives in the reporting. But it’s the best we have, and even with a small rate of false positives, each actual infection means hundreds of thousands or millions of people learn they are NOT at risk from that case. The lack of precision in the tech may also help ease the anxiety of getting a notification (which in any case doesn’t mean you’re infected), and if the worst result is a bit of over-testing, that’s a price worth paying.

The possibilities of data leaks, re-identification, indirect location disclosure, and the like are real. Both mistakes and malfeasance have occurred often in the past. But perhaps no surveillance tech in history will operate under as much review and scrutiny as this, and as described thus far in all the major reviews, these worries are not much more than theoretical possibilities at this point. Numerous highly technical reviewers have agreed that, based on what’s been disclosed so far, Apple and Google appear to be getting the tech right.

Will this tech ever get turned off? As stated earlier, we must make a sunset provision (and potentially a date) a requirement before it’s turned on. We need this from Apple/Google in a form and at a level we can all believe. As for governments and others, their interest in and opportunity for surveillance technology isn’t expanded or created by this implementation, and if this is giving anyone new ideas, they already have them. Not moving forward with this won’t prevent anything else from happening in the future; it will just cost us the benefits the tech can provide against our current needs.

Privacy Is A Right

To be clear, any universal or even broad ability for any company or government to track interpersonal connections on any level is one I would decry and fight against. From my work over the past few years it is crystal clear to me that there are endless dangers from this type of data being gathered in either commercial or state hands.

I work every day to help people take back control and protect their privacy, to limit and remove the ability of both corporations and governments to gather personal data. I decry the apps that leak it, or make privacy opt-in, or surreptitiously gather and transmit location, and the increasing horror that is facial recognition. It’s clear that the large digital profiles marketers and governments are building about each of us, filled with attributes and beliefs and location histories, plus inferences and data augmentations of all kinds, present clear and specific harms and dangers to everyone.

None of that is a particularly extreme opinion these days, thankfully. There has been a dramatic privacy awakening over the past few years. And yet we all carry devices, download apps, and browse the web while we do what we can to balance the benefits of technology and our privacy.

Finding Balance

The privacy argument surrounding Contact Tracing is not one we can have on a purely philosophical basis. It’s one with real-world implications for the health of millions and the economies of billions.

We need massively scaled and highly effective contact tracing or we will not get through or past this crisis. Along with scaled testing and breakthrough drugs and vaccines, this is our only hope. So what Apple/Google is offering is critical, and it is likely far better than anything others could offer, since it leverages their impressive technical capabilities and massive scale, as well as the many lessons they’ve already learned about privacy.

Let’s be clear: what they’re proposing is less privacy-invasive than using maps. It’s less privacy-invasive than having a Facebook account. It’s less privacy-invasive than many apps or websites people use every day. It’s purpose-driven, narrowly-focused, privacy-centric, and will exist only while we need it. We should wish most apps respected privacy as much as Apple/Google contact tracing does.

Those with concerns should keep asking questions, keep the pressure on, identify and expose weaknesses (so they can be fixed), and verify that the system is dismantled completely when this is over. But privacy enthusiasts and professionals also need to help educate the public on how well this technology protects privacy, and how little specific privacy risk it imposes. Keep an eye on theoretical risks, but let’s not use them to prevent progress.

We need to be able to leverage the best of technology and avoid its worst risks. Apple/Google are giving us that chance. Let’s take it.

Photo by Jacek Dylag on Unsplash
