Differential Privacy: what it means for our data and the future of machine learning


In order to compete with its rivals, Apple has begun to bet more heavily on artificial intelligence. Google and Facebook have no problem collecting user data and openly admit they do it to improve their artificial intelligence and Machine Learning systems, but Apple sees things differently: the Cupertino company does care about our privacy. That is why, at the last WWDC, it presented Differential Privacy, its system for collecting data and improving its AI while protecting our privacy.

Other companies constantly want to know things like where we are, what we buy or how we use the keyboard, including what we search for, but none of that seems to have ever interested Apple, a company that, in theory, has no business built on its customers' data: it sells products, not advertising. Tim Cook and company offer secure devices so that users also feel safe, and that is something Apple does not want to change.

Differential Privacy studies the crowd, protects the individual


As some specialists in Machine Learning and AI point out, Apple's problem is that if it does nothing, it will fall light years behind the competition when it comes to virtual assistants. This is where the Differential Privacy announced at the last WWDC comes into play. Craig Federighi explained it like this:

Differential Privacy is a research topic in the area of statistics and data analysis that uses hashing algorithms, subsampling and noise injection to allow this type of learning from many sources while keeping each user's information completely private.
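To make the "noise injection" part of that quote concrete, here is a minimal sketch in Swift of the Laplace mechanism, the textbook way of adding calibrated noise to an aggregate count. The function names and the epsilon value are illustrative; nothing here claims to be Apple's actual implementation.

import Foundation

// Sample from a Laplace(0, scale) distribution via inverse transform sampling.
func laplaceNoise(scale: Double) -> Double {
    var u = Double.random(in: -0.5..<0.5)
    while u == -0.5 { u = Double.random(in: -0.5..<0.5) } // avoid log(0)
    return -scale * (u >= 0 ? 1.0 : -1.0) * log(1 - 2 * abs(u))
}

// Release a count with epsilon-differentially private noise. A counting
// query changes by at most 1 when one user is added or removed
// (sensitivity 1), so the noise scale is 1 / epsilon.
func privateCount(trueCount: Int, epsilon: Double) -> Double {
    return Double(trueCount) + laplaceNoise(scale: 1.0 / epsilon)
}

// Example: publish how many users typed a given new word, with epsilon = 1.
print(privateCount(trueCount: 12_340, epsilon: 1.0))

The smaller epsilon is, the more noise is injected and the stronger the privacy guarantee: the published count is useful in aggregate but says almost nothing about any single user.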

Differential Privacy is not an Apple invention. Scholars have studied the concept for years, but with the release of iOS 10, Apple will begin to use it to collect and analyze data from keyboard, Spotlight and Notes usage.

Differential Privacy works by algorithmically encoding individual data so that no single person can be traced once the data of thousands of users has been analyzed to extract large-scale trend patterns. The objective is to protect the user's identity and the details of their data while still obtaining the general information that helps improve machine learning.

iOS 10 will randomly scramble our data on the device itself before sending it to Apple in bulk, so the data will never leave the device in a raw, identifiable form. On the other hand, Apple will not store every word we type on the keyboard or every search we perform because, as I mentioned before, it does not need to. The Cupertino company says it will limit the amount of data it can collect from each user.
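The classic way to randomize an answer on the device itself, before anything is sent, is randomized response, one of the building blocks of local differential privacy. The Swift sketch below uses illustrative names and a deliberately simple coin-flip scheme; Apple's real pipeline is more elaborate, but the principle of recovering honest statistics from deliberately noisy reports is the same.

import Foundation

// Each device flips coins before reporting a sensitive yes/no fact,
// so the collector never learns any single user's true answer.
func randomizedResponse(truth: Bool) -> Bool {
    if Bool.random() {
        return truth          // first coin heads: report honestly
    } else {
        return Bool.random()  // first coin tails: report a random answer
    }
}

// Collector side: P(report = true) = 0.25 + 0.5 * trueFraction,
// so the true fraction is recovered by inverting that formula.
func estimateTrueFraction(reports: [Bool]) -> Double {
    let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
    return (observed - 0.25) / 0.5
}

// Simulate 100,000 users, 30% of whom have the sensitive attribute.
let reports = (0..<100_000).map { _ in
    randomizedResponse(truth: Double.random(in: 0..<1) < 0.3)
}
print(estimateTrueFraction(reports: reports)) // prints a value close to 0.3

No individual report can be trusted, yet across thousands of users the estimate converges on the real figure, which is exactly the trade-off described above.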

Apple showed the documentation of its Differential Privacy implementation to Aaron Roth, a professor at the University of Pennsylvania who arguably wrote the Bible on the subject (The Algorithmic Foundations of Differential Privacy), and he described Apple's work in this area as "groundbreaking."

How Differential Privacy works


Differential Privacy is not a single technology. It is an approach to data processing that imposes restrictions to prevent data from being linked to specific users. It allows the data to be analyzed as a whole, but adds some noise to it, so that individual privacy does not suffer even as the data is processed en masse. Adam Smith defines it as follows:

Technically, it is a mathematical definition. It simply restricts the different ways in which data can be processed, and it restricts them so that not too much information about any individual data point in the data set can be leaked.
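For readers who want the formal statement behind that description, the standard definition from the Dwork and Roth book cited above can be written as follows: a randomized mechanism M is ε-differentially private if, for every pair of data sets D and D' that differ in a single individual's record, and for every set of possible outputs S,

\Pr[M(D) \in S] \le e^{\varepsilon} \cdot \Pr[M(D') \in S]

In words: whether or not any one person's data is included barely changes the probability of any outcome, so the output reveals almost nothing about that person.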

He also compares Differential Privacy to picking out an underlying melody behind a layer of static on a poorly tuned radio:

Once you understand what you are hearing, it is really easy to ignore the static. So it is a bit like what happens with each individual: you do not learn too much about any one individual, but overall you can see the patterns pretty clearly.

Smith believes that Apple is the first major company to attempt to use Differential Privacy at scale. Other companies, such as AT&T, have conducted studies but have not yet dared to deploy it.

And what about the future of Artificial Intelligence?

The Silicon Valley privacy debate is often viewed through the lens of law enforcement, where privacy is balanced against national security. For companies, the debate is between privacy and features. What Apple has started could radically change that debate.

Google and Facebook, among others, have tried to answer the question of how to offer great, feature-rich products that remain private at the same time. Neither Allo, Google's newest messaging app, nor Facebook Messenger offers end-to-end encryption by default, because both companies need user data to improve their machine learning and keep their bots working. Apple also wants to collect user data, but it will not remove iMessage's end-to-end encryption. Smith says that Apple's implementation could make other companies change their minds.

In short, it seems that Apple has dared to deploy a system that until now existed only in theory, one that will let it collect data from many people without violating our privacy. Will anyone copy it?

