Crypto professor doubts the effectiveness of Differential Privacy

Differential Privacy

The Artificial Intelligence behind virtual assistants has been the subject of debate in recent months. The reason is that, for AI to be effective, our assistants have to compromise our privacy to a greater or lesser extent. With this in mind, Apple finds it hard to compete with assistants such as Google Now, but last Monday the Cupertino company presented Differential Privacy, which, in theory, can learn about us without violating our privacy.

Apple is concerned about our privacy and will try to respect it at all costs, even if that means lagging behind in some areas. Siri was the first virtual assistant to reach mobile devices, and now it simply doesn't measure up. The reason is that other virtual assistants collect information about their users in many ways, with little restraint, and are thus taking giant steps forward. But Apple has a plan that, in theory and if it can be put into practice, will allow its virtual assistant to keep up (and possibly outperform the competition) without anyone having access to the data collected from us.

Differential Privacy will make a difference in terms of privacy

Apple mentioned this feature last Monday as part of iOS 10, in passing, while saying that they had "improved security and privacy with technologies such as Differential Privacy." But a leading expert in cryptography, Matthew Green, has questioned whether this technology is safe:

Most people go from theory to practice, then to widespread deployment. With Differential Privacy, it appears that Apple has skipped the middle step.

According to Green, Differential Privacy needs to compromise our privacy to some degree in order to obtain accurate data. The questions Green raises are what kind of data will be collected, what measures will be applied, and what Apple will do with that data.

It is a really neat idea, but the truth is that I have never seen it put into practice. It ends up being a trade-off between the accuracy of the data being collected and privacy. Accuracy goes down as privacy goes up, and the trade-offs I have seen have never been that great. I have never heard of anyone shipping a product like this before. So if Apple is doing this, they have a custom implementation and have made all the decisions themselves.
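The trade-off Green describes can be illustrated with randomized response, a classic local differential privacy technique (a textbook sketch, not Apple's actual implementation, which has not been published). Each user reports their true answer only with probability `p` and lies otherwise; the collector can still estimate the population rate, but the estimate gets noisier as `p` drops toward 1/2 (more privacy, less accuracy). All names and parameters below are illustrative:

```python
import random

def randomized_response(true_bit: bool, p: float, rng: random.Random) -> bool:
    """Report the true bit with probability p, the flipped bit otherwise."""
    return true_bit if rng.random() < p else not true_bit

def estimate_rate(reports, p: float) -> float:
    """Unbiased estimate of the true 'yes' rate from noisy reports."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

rng = random.Random(42)
true_rate = 0.30          # fraction of users whose true answer is 'yes'
n = 100_000               # number of simulated users
p = 0.75                  # honesty probability: higher p = more accuracy, less privacy

truth = [rng.random() < true_rate for _ in range(n)]
reports = [randomized_response(b, p, rng) for b in truth]
print(round(estimate_rate(reports, p), 3))  # lands close to 0.30
```

No individual report reveals a user's true answer with certainty, yet the aggregate estimate recovers the population statistic; shrinking `p` toward 0.5 strengthens each user's deniability while widening the estimator's error, which is exactly the accuracy-for-privacy exchange Green is pointing at.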

On the other hand, Aaron Roth, associate professor of computer science at the University of Pennsylvania, labeled Differential Privacy "visionary" and said it "puts Apple as a clear leader" in terms of privacy among technology companies worldwide. The question here is: do we trust Apple? And if the answer is "no": which technology company could we trust more to protect our privacy?



Comments:

  1.   Alejandro said

    Pablo, there is no such company. Privacy, one way or another, is lost.

    The main time bomb is social networks, which "invite" you to create an account. Who knows what that will be used for in the medium or long term. Directly or indirectly, our digital footprint exists, and once it is created and uploaded to the internet, forget it. There is no privacy.

  2.   Sebastian said

    The translation of the tweet is incorrect: "Most people go from theory to practice, then to general development. With Differential Privacy, Apple seems to be halfway there." The correct version is "…it appears that Apple has skipped the middle step", i.e. that it has not performed the second step.