Apple Against Child Abuse: New Hash System in iCloud

Apple wants to tackle child abuse with control systems integrated into the photo library itself. This is a delicate matter to comment on, as it could undermine user privacy. In this article we cover all the details of the new hash-based detection system that Apple reportedly plans to integrate into its operating systems.

This is how Apple’s hash-based detection would work

Child abuse is undoubtedly a global problem that must be fought with every available tool, and Apple wants to do its part by building tools to search for child sexual abuse material. None of this is official yet: it has surfaced through various reports, although everything indicates it will finally be announced in the coming weeks. The system would be installed client-side on the device, in the privacy section. A set of fingerprints representing known illegal content would be downloaded in order to check every photo in the gallery against it. A manual review would then be carried out, so that such an important decision is not left in the hands of an algorithm alone.
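The flow described above can be sketched in a few lines. This is only an illustration of the reported design, not Apple's implementation: Apple's real system reportedly uses a proprietary perceptual hash, while this sketch substitutes plain SHA-256, and the fingerprint values and photo bytes are made up for the example.

```python
import hashlib

# Hypothetical fingerprint set downloaded to the device. The real system
# reportedly uses a proprietary perceptual hash; SHA-256 is a stand-in here.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"illegal-sample-1").hexdigest(),
    hashlib.sha256(b"illegal-sample-2").hexdigest(),
}

def scan_library(photos: list[bytes]) -> list[int]:
    """Return indices of photos whose hash matches a known fingerprint.

    Per the reports, matches are only candidates: a human review step
    follows before any decision is made.
    """
    flagged = []
    for i, data in enumerate(photos):
        if hashlib.sha256(data).hexdigest() in KNOWN_FINGERPRINTS:
            flagged.append(i)
    return flagged

library = [b"holiday-pic", b"illegal-sample-1", b"cat-photo"]
print(scan_library(library))  # → [1]
```

Note that everything here runs against a local list, mirroring the on-device design the reports describe: only match results, not the photos themselves, would ever need to leave the phone.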

These techniques are not new to the ecosystem. They are already used in the iCloud photo library to categorize photos by their content. That is why the entire review would always be done on the device rather than on the company’s servers, in an attempt to guarantee privacy and security. You can always prevent iCloud from storing your photos, which would disable this system. These tools could also improve the machine learning features already in place, offering a better overall experience. Still, even though all of these functions are aimed at fighting child pornography, the data privacy legislation of each country must be taken into account, as must the legal channels to be followed when one of the photographs produces a positive match.

It is not an infallible or perfect system

Security experts have weighed in on these reports, stating that such a system is not infallible. Hashing algorithms can produce false positives, since reliably detecting an image as child pornography is complex given the enormous diversity of images that may exist. The geopolitical implications of this kind of system should also be studied. It must be taken into account that the match information would be made available to governments, so it could be repurposed for other tasks, such as suppressing political activism in countries that are not full democracies.

It should also be noted that photos currently uploaded to iCloud for backup are encrypted at rest but not end-to-end encrypted. Apple can always hand the keys to decrypt this content to a government, allowing the entire library of a specific user to be viewed, although the same is true of most cloud services. But as mentioned previously, this is an early idea gathered from a report, which means it could change radically by the time it is officially announced.