Privacy means different things to different people and organizations, and no company is as invested in being different as Apple is. Now, Apple is making a dramatic change to the way it handles user data from a privacy perspective with its plan to employ a strategy known as differential privacy.
The theory behind differential privacy is that companies can collect and analyze data about groups of users without being able to identify any specific user within a group. The technique involves adding random noise to a data set to obscure user-specific information while leaving aggregate statistics intact. Differential privacy is a relatively new field of study, and Apple is one of the few companies to have said publicly that it will employ this strategy, a change it announced at its Worldwide Developer Conference on Monday.
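The noise-injection idea can be illustrated with randomized response, one of the simplest differentially private mechanisms. This is a minimal sketch for intuition only, not Apple's actual implementation, which has not been publicly detailed: each user flips a coin before answering, so no individual report can be trusted, yet the aggregate rate can still be recovered.

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true answer with probability p; otherwise report a
    fair coin flip. Every individual answer has plausible deniability."""
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_true_rate(responses, p: float = 0.75) -> float:
    """Invert the noise on the aggregate:
    E[reported rate] = p * true_rate + (1 - p) * 0.5."""
    reported = sum(responses) / len(responses)
    return (reported - (1 - p) * 0.5) / p

# A population where 70% would truthfully answer "yes":
random.seed(0)
population = [True] * 700 + [False] * 300
responses = [randomized_response(answer) for answer in population]
print(round(estimate_true_rate(responses), 2))  # close to 0.7
```

The analyst learns an accurate group-level estimate while any single noisy response reveals almost nothing about the person who submitted it, which is the trade-off differential privacy formalizes.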
“Differential privacy is a research topic in the areas of statistics and data analytics that uses hashing, subsampling and noise injection to enable this kind of crowdsourced learning while keeping the data of individual users completely private. Apple has been doing some super-important work in this area to enable differential privacy to be deployed at scale,” Craig Federighi, Apple’s senior vice president of software engineering, said during a keynote at the event.
Rich Mogull, an analyst at Securosis who follows Apple security and privacy moves closely, said the adoption of differential privacy isn’t a tectonic shift for the company.
“The key difference is that differential privacy is about having mathematical proof that anonymization and privacy techniques work,” Mogull said via email.
Differential privacy employs hashing and other techniques to help anonymize data. The practice is designed to allow companies to collect and analyze data through artificial intelligence without compromising users’ anonymity. This is in contrast to services such as Google’s Allo messaging service, which in some configurations uses AI to parse users’ data in plain text. Because differential privacy is a young idea, it hasn’t seen much real-world testing, and some security experts worry about Apple’s rapid adoption of it.
“Most people go from theory to practice, then to widespread deployment. With Differential Privacy it seems Apple cut out the middle step,” cryptographer Matthew Green said on Twitter.
Apple, like most technology companies, collects vast quantities of information about its customers, and uses it for a variety of things. The company has built a solid reputation for the security of its devices and its privacy practices, attributes that manifested themselves publicly earlier this year in Apple’s battle with the FBI over an encrypted iPhone used by a terrorist.
Differential privacy will be deployed first in iOS 10, which will be available later this year. The company said it will be used, for example, to help improve suggestions when users are typing. In a quote displayed during the WWDC keynote, Aaron Roth, a professor at the University of Pennsylvania who co-wrote a book on differential privacy in 2014, called Apple’s implementation of the idea “visionary.”
“Incorporating differential privacy broadly into Apple’s technology is visionary and positions Apple as the clear privacy leader among technology companies today,” Roth said.
Mogull said users likely won’t notice any major differences once differential privacy is adopted, noting that the system is still in its very early stages.