We are close to the widespread adoption of an encryption method that stays true to what every user ultimately wants: privacy. Every day, sharing personal or professional data exposes people to countless risks of data breaches. Intel, one of the world's technology giants, is once again at the forefront with an initiative that could change the way Machine Learning handles data.
Privacy in data processing is a reality
In a hospital, a study is performed on a patient through a remote radiology service. All the information obtained is encrypted and shared with an expert radiologist, who performs all the necessary processing while the data remains encrypted. The data never needs to be decrypted during processing, and the results it produces are encrypted as well. Decryption happens only at the very end, when the final review is made and the diagnosis is delivered to the patient. Does that sound possible?
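The workflow described above can be sketched with a toy additively homomorphic scheme. The following Python sketch uses the textbook Paillier cryptosystem; the tiny primes and the example "measurement" values are illustrative assumptions for this article, not secure parameters or anything Intel has published:

```python
import math
import random

# Toy Paillier key generation (tiny, insecure primes for illustration only)
p, q = 17, 19
n = p * q                      # public modulus
n2 = n * n
g = n + 1                      # standard choice of generator
lam = math.lcm(p - 1, q - 1)   # private key component lambda
# mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m):
    """Data owner's side: encrypt a measurement m (0 <= m < n)."""
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Data owner's side: decrypt a result with the private key."""
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Remote service: combines encrypted readings WITHOUT ever decrypting them.
c1, c2 = encrypt(12), encrypt(30)
c_sum = (c1 * c2) % n2         # multiplying ciphertexts adds the plaintexts
print(decrypt(c_sum))          # -> 42
```

The remote party only ever sees `c1`, `c2`, and `c_sum`; the private key, and therefore the plaintext, never leaves the data owner.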
Intel proposes a technology called Homomorphic Encryption. Machine Learning is gaining ever more prominence in products and services built around Artificial Intelligence, and these systems are fed and trained with large amounts of data. The challenge for the company, as for the industry in general, is to remain increasingly faithful to the privacy of that data.
How does this encryption method work?
To place ourselves in context, the word homomorphic breaks down into homo- (same) and -morph (form). The name reflects the purpose: to perform ordinary operations directly on encrypted data, with the sole need to decrypt the result upon delivery.
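To make "same form" concrete: unpadded textbook RSA is multiplicatively homomorphic, meaning the product of two ciphertexts decrypts to the product of the two plaintexts. A minimal sketch, using tiny, insecure textbook parameters chosen only for illustration:

```python
# Textbook RSA with tiny, insecure parameters, for illustration only.
p, q = 61, 53
n = p * q                            # 3233, public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

# The "same form" property: E(a) * E(b) mod n is a valid encryption of a * b.
c = (encrypt(6) * encrypt(7)) % n
print(decrypt(c))                    # -> 42
```

Whoever multiplies the ciphertexts learns nothing about 6 or 7; only the key holder, decrypting at the end, sees the result. Fully homomorphic schemes extend this idea to both addition and multiplication, which is what makes general computation on encrypted data possible.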
The common weakness of data processed by ML is that it is encrypted only when captured at the source; at processing time, it is decrypted. At that moment the risk of a privacy violation or an attack is ever-present, with potentially catastrophic consequences.
It may seem too good to be true, but Homomorphic Encryption has existed as a theory for 20 years. In the last five years there have been several advances in encrypted-data processing techniques, so the fact that operating on encrypted data takes much longer than operating on decrypted data is becoming less of an obstacle.
Although Intel is a leader in implementing this encryption method, it continues to push for industry-wide agreement on what is best for data privacy in ML. About half a year ago, Google, IBM, Microsoft and other giants met with Intel to discuss it. Another objective is to turn Homomorphic Encryption into a standard backed by bodies such as ISO, IEEE and ITU.
Preserving the privacy of data while operating on it opens up thousands of opportunities. It could even enable a highly profitable and, above all, transparent business model, both for the party that needs the data and for the party that owns it. It is one more chance for users to perceive more benefit than harm when sharing their data with other parties.