We have been hearing about this issue for several days now, yet the statements from both sides seem to conflict. Apple has created great uncertainty for everyone who owns an iPhone or uses iCloud, which means a large share of users worldwide.
Apple’s decision to take the reins in fighting child abuse and child pornography has come as a bucket of cold water for many on the privacy side. It is a measure worthy of applause, but for many it also means giving large companies, in this case Apple, free rein to access our images.
Complaints against Apple
It did not take long for users outraged by Apple’s decision to appear, and an open letter has even been published online. In it, Apple is asked to change its plans because, although the issue it addresses is important, the measure poses a threat to users’ privacy.
Some media outlets have already warned that this decision could become a back door for surveilling certain users. More than 5,000 people have signed the letter so far, and the list keeps growing. The message is clear: “If you can search for child pornography today, you can search for anything tomorrow.”
Where is the privacy limit?
Although most people are not worried that a machine can “analyze” their images, the measure breaks with established norms, since it grants software access to our memories. It would bypass the phone’s privacy protections to examine our images, intruding on our private lives.
This in turn could mean that, step by step, we move toward broader surveillance: today images and videos, tomorrow perhaps the messages we send, in the name of preventing other social problems. Some argue, with or without exaggeration, that this amounts to sweeping control of society at the expense of individual freedoms. An opinion everyone will have to weigh for themselves.
Apple defends itself and explains the mechanism
Given the commotion caused by the new measures Apple plans to roll out as soon as possible, the company has come out to explain how the scanning of images and the search for prohibited material will actually work. The document opens with this statement:
We want to protect children from predators who use communication tools to recruit and exploit them, as well as limit the dissemination of child sexual abuse material.
Apple wants to make clear that at no time will photos be analyzed for any other purpose. First, only images sent through iMessage will be analyzed, and only in accounts created for children within a family group; moreover, the feature must be explicitly enabled by the adults.
The second measure applies to iCloud, so anyone who does not use it to sync their photos is unaffected. The analysis works through hashes: each photo is transformed into a sequence of ones and zeros and compared against a database of known abuse material. Images are only reviewed, and later reported to the authorities, once enough matches have triggered the technology’s alerts.
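The matching process described above can be sketched in a few lines. This is a deliberately simplified illustration, not Apple’s actual system: the real mechanism uses a perceptual hash (NeuralHash) that tolerates small image changes, whereas plain SHA-256 below only matches identical files; the database contents, function names, and threshold value are all hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known prohibited images.
# (Plain SHA-256 of raw bytes is a stand-in for a perceptual hash.)
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# Illustrative threshold: an account is only flagged for review once
# enough uploads match, mirroring the "all alerts triggered" idea.
MATCH_THRESHOLD = 2

def scan_uploads(image_blobs):
    """Return True if enough uploaded images match the known database."""
    matches = 0
    for blob in image_blobs:
        digest = hashlib.sha256(blob).hexdigest()
        if digest in KNOWN_HASHES:
            matches += 1
    return matches >= MATCH_THRESHOLD

# A single match alone does not cross the threshold.
print(scan_uploads([b"known-image-1", b"holiday-photo"]))   # False
print(scan_uploads([b"known-image-1", b"known-image-2"]))   # True
```

The threshold is the key design choice: comparing hashes rather than images means the system never "sees" photo content, and requiring multiple matches before any review reduces the impact of false positives.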
Other companies such as Google and Microsoft have been doing something similar for years, but Apple has long made maximum privacy its selling point. This measure now leaves the company exposed.