If you’re a fan of Apple or use any of their products, then you’ve likely been following the recent news from Apple’s Worldwide Developers Conference. One of the new “features” that’s been getting press is Apple’s announcement that it will begin using a technology known as “Differential Privacy” in iOS 10 in order to continuously improve the iOS user experience.
Although Apple rarely shares details about its OS internals, it did provide the following explanation of differential privacy:
Starting with iOS 10, Apple is using Differential Privacy technology to help discover the usage patterns of a large number of users without compromising individual privacy. To obscure an individual’s identity, Differential Privacy adds mathematical noise to a small sample of the individual’s usage pattern. As more people share the same pattern, general patterns begin to emerge, which can inform and enhance the user experience. In iOS 10, this technology will help improve QuickType and emoji suggestions, Spotlight deep link suggestions and Lookup Hints in Notes.
Obviously, like all tech companies, Apple wants to know as much as possible about its user base. However, with this announcement, Apple is taking a stance above that of its tech peers. Much as we saw in Apple’s recent feud with the FBI over unlocking the iPhone of San Bernardino shooter Syed Farook, Apple continues to position itself as the ‘Privacy Champion’ of Silicon Valley.
How does differential privacy work?
Everyone wants the best features, but nobody wants to sacrifice their privacy and personal space. So how does Apple plan to pull this off? Essentially, Apple will still collect your usage data, but it will first “inject” that data with mathematical noise — “fake” statistics that supposedly make it impossible to identify you as an individual. This in turn lets Apple stay competitive in machine learning and artificial intelligence, because trends and patterns still emerge once the sample size is large enough. Apple can then apply changes to its software that improve your experience with iOS devices.
Aaron Roth, a computer science professor at the University of Pennsylvania, explains differential privacy this way: “With a large dataset that consists of records of individuals, you might like to run a machine learning algorithm to derive statistical insights from the database as a whole, but you want to prevent some outside observer or attacker from learning anything specific about some [individual] in the data set.”
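Apple hasn’t published the exact mechanism it uses, but the classic textbook example of this idea is “randomized response”: each user lies about their answer with some probability, so no single report can be trusted, yet the true overall rate can still be recovered from a large sample. Here’s a minimal Python sketch of that technique (the emoji-usage scenario and all names here are purely illustrative, not Apple’s actual implementation):

```python
import random

def randomized_response(truth: bool, rng: random.Random) -> bool:
    """Report the true answer half the time; otherwise report a fair coin flip.

    Any single response is plausibly deniable -- a "yes" might just be
    the coin -- but aggregates across many users remain useful.
    """
    if rng.random() < 0.5:
        return truth
    return rng.random() < 0.5

def estimate_true_rate(responses: list) -> float:
    """Invert the noise. Since observed_yes = 0.25 + 0.5 * true_rate,
    the true rate is approximately 2 * observed_yes - 0.5."""
    observed = sum(responses) / len(responses)
    return max(0.0, min(1.0, 2 * observed - 0.5))

# Simulate 100,000 users, 30% of whom truly use a hypothetical emoji.
rng = random.Random(42)
truths = [rng.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(t, rng) for t in truths]

print(estimate_true_rate(reports))  # close to 0.30, despite the noise
```

The key trade-off this illustrates: the noise protects each individual, but it also means Apple needs large numbers of users sharing the same pattern before anything useful emerges — exactly what the company’s statement describes.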
Why should I care about differential privacy?
If you’re already using an Apple product, chances are you’ve agreed to its Terms of Service, which essentially give the company the right to do almost anything with your data. That said, you’ll probably never notice any difference with the introduction of differential privacy, other than your phone being a little bit better at predicting the next word you’re about to type. Boo auto-correct! Hurray differential privacy! The main takeaway from this announcement is that Apple truly is taking a positive stance (from a user standpoint) on privacy and security, and is showing that it has its users’ best interests in mind, as opposed to some of its larger competitors (*cough* *cough* Google *cough* *cough*).
Lastly, we at Abine would point out that Blur’s Masking technology acts somewhat like differential privacy for your online identity. Masked info replaces your real info in many places, injecting random data so that you can still use services — and they can still track your behavior — without easily knowing exactly who you are.
For more info on Masking: https://dnt.abine.com/#feature/masking
For more info on Differential Privacy: https://www.wired.com/2016/06/apples-differential-privacy-collecting-data/