Apple iPhone Privacy

Apple recently released a fantastic new ad highlighting a little of what is happening to all of us every day—we’re sharing highly personal information with strangers.

Of course, we aren’t sharing the same way as the people in the ad. We don’t share out loud, we don’t know or see the people we’re sharing with, and, most often, we don’t even know we’re sharing our information. 

Our devices, apps and vendors are taking our data and widely distributing it without our knowledge or permission (or at least without our informed consent).

The kind of random personal information sharing in the Apple ad is nuts, but reality is 10,000% worse. 

In the real world, our data is shared not with random passersby but with companies that remember it, compile it, and derive new insights from the aggregated data and the inferences they make from it. We aren’t sharing a fact here and a tidbit there; we’re streaming dozens of details every second from the one to ten devices or apps that we may be using or simply be near.

Apple made this commercial to point out that an iPhone protects you more than other devices do – which is absolutely true. The iPhone has always offered strong security and privacy features, and Apple has increased their efforts in those areas with each new phone and each new operating system release.

But Apple does not have a spotless record when it comes to protecting their consumers’ privacy. There are quite a few missed opportunities that leave their customers unnecessarily exposed, and some choices that are clearly harmful and counterproductive to their other efforts.

Below we’ll walk through what we see as the good, the ‘could be better’, and the ugly of Apple’s privacy record.

The Good

By current commercial market standards, Apple has done tremendous work to advance consumer privacy and protect personal data, at least in terms of stopping data from getting outside of users’ devices and Apple’s servers. Their device security, which combines strong passcodes with Touch ID/Face ID delivered via the Secure Enclave, along with the Find My app’s remote erase, keeps these data-rich devices very secure. The pressure Apple has faced from law enforcement agencies that cannot access data on iPhones is a testament to how difficult it is for anyone other than the device owner to get data off an iPhone.

They provide users with a free and worthwhile password manager on iOS and macOS, and they have helped normalize two-factor authentication. They’ve been pioneers in ‘differential privacy’, which enables Siri and other Apple tech to use AI and ‘big data’ from their users while meaningfully protecting individual privacy.
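Apple’s production systems use more elaborate local differential privacy mechanisms than this, but the core idea can be sketched with the classic ‘randomized response’ technique: each device adds noise before reporting, so no individual report can be trusted, yet the aggregate remains accurate. All names and parameters below are illustrative, not Apple’s:

```python
import random

def randomized_response(true_answer: bool, p_truth: float = 0.75) -> bool:
    """Report the true answer with probability p_truth; otherwise flip a coin.
    Any single report is deniable, but aggregates stay estimable."""
    if random.random() < p_truth:
        return true_answer
    return random.random() < 0.5

def estimate_true_rate(reports, p_truth: float = 0.75) -> float:
    """Invert the noise: E[reported_yes] = p_truth * q + (1 - p_truth) * 0.5,
    where q is the true 'yes' rate across the population."""
    reported_yes = sum(reports) / len(reports)
    return (reported_yes - (1 - p_truth) * 0.5) / p_truth

# Simulate 100,000 users, 30% of whom truly use some feature.
random.seed(1)
truth = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(t) for t in truth]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

The server never learns any individual’s true answer, yet the estimated rate converges on the real one as the population grows – which is why this style of technique scales well across millions of iPhones.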

Apple has led the market: Safari was the first major browser to block third-party cookies by default, stop cross-site tracking, limit location data sharing with apps, and much more. They added support for Content Blockers to their browsers, are integrating more direct tracker blocking in iOS 14, and they gave iMessage end-to-end encryption (though some risk remains in their iCloud backup system). They also enabled an emergency mode so you can quickly disable Touch ID/Face ID if confronted with officials who may force you to unlock your phone against your will (remember: courts have generally held that you cannot stop officials from using your face or fingerprint to unlock your phone, but you cannot be forced to tell them your passcode).

In each of these efforts, and many others, Apple has demonstrated not only leadership but genuine concern for personal privacy protection and standards. 

The ‘Could Be Better’

In addition to all of the above, Apple has taken steps to enable privacy or security by offering options and settings that users can selectively apply to further their protection. But while they deserve praise for making these controls available, in many cases they leave the defaults in an unprotected state and require users to opt in to privacy.

This is unfortunate; it would be preferable if you had to opt in to data sharing instead of having to opt out. When users have to opt in to privacy, they have to understand the options, find them, and change them. We all know – and Apple knows too – that inertia leaves most settings at the factory defaults. The result of making privacy an ‘opt-in’ option is that a lot of people share a lot of personal data that they don’t know they’re sharing and that they didn’t know they could prevent.

iPhone’s privacy options are numerous, spread out, and, too often, not clearly named or explained. Our Priiv app teaches people dozens of ways to make their iPhone protect their privacy more than it does right out of the box. That’s a lot of time and energy people still have to spend that, frankly, is too technical for many, too time consuming, and could be made a lot simpler.

Of course, Apple serves hundreds of millions of people. Every feature they add and every option they change makes a huge difference, and they have to balance the needs of a massive user base – these are not easy changes. The fact that they have moved steadily towards privacy – adding granularity to location tracking options over the last few releases, for example, and offering a new ‘Precise Location’ toggle in iOS 14 (which lets you share an approximate, ‘fuzzy’ location instead of your precise one) – is thoughtful progress that deserves commendation.
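iOS 14’s actual approximate-location feature is more sophisticated (it snaps you to a larger region rather than simply rounding), but the basic precision trade-off behind a ‘fuzzy location’ can be sketched by coarsening coordinates before an app ever sees them. The helper below is hypothetical, not an Apple API:

```python
def coarsen_location(lat: float, lon: float, decimals: int = 2) -> tuple:
    """Round coordinates so an app sees roughly neighborhood-level precision.
    ~0.01 degrees is on the order of a kilometre; raw GPS resolves to metres."""
    return (round(lat, decimals), round(lon, decimals))

# Hypothetical precise GPS fix near Cupertino.
precise = (37.331686, -122.030656)
print(coarsen_location(*precise))  # (37.33, -122.03)
```

A weather or news app works just as well with the coarse value, which is exactly the argument for making approximate location the default rather than an opt-in.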

But why does Apple share a unique advertising identifier for each user with apps unless the user enables the Limit Ad Tracking option? Why do they opt users into personalized ads from their own ad network unless the user opts out? Why do they opt users into location-based suggestions for Siri, enable ‘Significant Locations’ tracking that creates a detailed list of every single place you’ve ever taken your iPhone, and bury it four levels deep so that very few people even know it’s there?

Want to be scared? Go to Settings > Privacy > Location Services > System Services > Significant Locations > History. The data never leaves your phone (they say) and is used to help improve Photos Memories, Maps, and other features. But is it right that you weren’t asked about this? That there is no option to delete it every 30 days?

The real answer is that Apple is – as Apple tends to do – playing the long game and moving in the right direction, but doing it very slowly. In these cases, they are balancing the disruption that a privacy-centric future will cause to the current commercial framework. Apple is attempting to turn the knob slowly enough that they can improve things for users while giving marketers time to adjust to the new privacy-first world.

Look at what they’ve done with tracking: first third-party cookies were killed, then cross-site tracking, then more powerful tracker blocking was introduced, and now, finally, unique IDs are being shuttered. Step by step, Apple has dismantled key underpinnings of the surveillance-capitalism infrastructure. This impacts every business that advertises online, and it goes to the heart of those that make their money from those ads, including news, publishing, and most websites. If Apple had turned these dials in a day, instead of over five years, the impact would have been traumatic for those businesses.

All to say: we understand their approach, but we have to be honest that if they were really putting privacy first, and doing all they could to protect their customers’ privacy, a lot of default settings would change this week.

The Ugly

Apple’s support for consumer privacy falls short in two areas: the way they themselves collect personal data and the limited options and controls they offer over that data, and the way they allow apps to use embedded SDKs to grab and share data without user knowledge or consent.

That Apple itself gains access to huge amounts of our personal data is not surprising; the nature of their hardware and software requires much of it. But Apple collects more than is strictly necessary, and they offer only very blunt tools to control their access.

You can’t use an iPhone without creating an Apple account, and it’s not easy to create an account without sharing identity information with Apple. Once you have an account, they collect extensive information on how you use your phone, including every app you install and when you use it. There is no way to end-to-end encrypt the data they back up to their servers. The only way to delete any of this data (except Siri searches) is to delete your entire Apple account – a rather dramatic step.

It would be nice if there were options to delete data older than six months, for example, or to opt out of more data types even at the cost of reduced functionality. There are many reasons you might want to do this; one is that data held by Apple is subject to subpoena, and data that exists is data at risk of theft or leaks, even at a well-run and well-intentioned company like Apple.

To get an idea of how much data Apple has about you, visit https://privacy.apple.com/ and request a copy of the data they have. This page also includes the drastic option of asking them to delete your account and your data.

The larger and more impactful problem for most people is Apple’s willingness to look the other way regarding SDKs embedded in App Store apps. SDKs are components built into apps, often to provide necessary functionality from third parties – but because of their technical complexity, they live in a loophole land that in some cases allows developers to capture personal data and share it with many other companies, without the knowledge or permission of the app user.

Location tracking is the best example. As has been exposed many times, all kinds of apps that gain location access in order to operate – maps, weather apps, shopping apps, etc. – use third-party SDKs to perform their location magic, and the companies that provide the core location data services often gather this location data from millions of unsuspecting users and re-use or re-sell it. The problem here is the sharing and re-sharing; it’s similar to the problem with embedded code in web pages, where you visit one website and 100+ companies get your data.
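To make the loophole concrete, here is a deliberately simplified sketch of how an embedded SDK can do its advertised job while quietly harvesting data on the side. Every name here (WeatherSDK, the broker flush) is hypothetical; no real SDK or endpoint is depicted:

```python
import json

class WeatherSDK:
    """Does the job the app developer wants (weather lookups) while
    quietly batching precise locations for a data-broker endpoint."""

    def __init__(self):
        self._harvested = []  # the part neither the user nor the developer audits

    def forecast(self, user_id: str, lat: float, lon: float) -> str:
        # Legitimate function: the app only cares about this return value.
        self._harvested.append({"id": user_id, "lat": lat, "lon": lon})
        return f"Forecast for ({lat:.2f}, {lon:.2f}): sunny"

    def _flush_to_broker(self) -> str:
        # In a real SDK this would be an opaque network call, invisible
        # to the app developer and to Apple's review process alike.
        return json.dumps(self._harvested)

sdk = WeatherSDK()
sdk.forecast("user-123", 37.331686, -122.030656)
print(sdk._flush_to_broker())
```

The app developer sees only the forecast; the side channel rides along inside the same permission grant the user approved for a completely different purpose.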

As the New York Times showed us, the implications of our location data leaks are enormous, and our apps – ones we think do one thing but that are secretly doing another – are the largest source of those leaks. Apple has tightened location controls in the past and enhances them in iOS 14 with new warnings, but those warnings are a kind of clue, not a prevention, for the massive location abuse we’re all suffering.

To be fair, it would be very difficult for Apple to police this, because developers legitimately need SDKs (the technology itself is not the problem), and Apple has no practical way to see or know what these small chunks of code do with the data once it reaches remote servers. It could be simple and ethical processing, or it could be complex and massive redistribution. This is not unlike the data-sharing problem Facebook faced that led to the entire Cambridge Analytica mess: the core functionality had reasonable uses but, when used unethically, turned into a nightmare. There is no doubt Apple faces this same nightmare right now, and it’s time for the pressure to build so they figure out how to wake us all up from it. There is no doubt a change in this area is coming; we hope it’s coming soon.

Anything Else?

Apple has created a unique position through vertical integration, making everything from the chips to the hardware to the software to the development tools to the network, cloud storage, and payment system. Which is to say, they’ve already shown they don’t confine themselves to any one box.

Two more expansions we’d love to see them make. First, Google search has to go. You cannot be privacy friendly and push zillions of user searches onto the Google platform; every one is used against the people who make them. DuckDuckGo should be the default Apple search engine.

Second, Apple should build a VPN into iOS and macOS. VPNs are the most confusing software purchase possible; their security relies almost entirely on trusting the provider’s execution, which is very hard to do, and the deep network integration required is very difficult for anyone but Apple to achieve across Apple’s hardware and network interfaces. Plus, it’s complex software from a UX perspective, and nobody has come close to nailing it for consumers. Apple could do it. VPNs protect consumers against all kinds of WiFi risks, unscrupulous (but fully legal) ISP business tactics, and various kinds of bad guys. An Apple VPN would safeguard traffic and personal data in a fundamental and welcome way.

The Good News

While we hate to be the ones to point out that Apple isn’t perfect, the trajectory Apple has been on suggests most of our wishes could come true in the next few years. We’re all very lucky they have a business model that doesn’t rely on leveraging our data, or on helping others do so.
