
Apple To Begin Scanning iPhones for Child Sexual Abuse Images

Apple stated that it will begin to scan iPhones for images of child sexual abuse. Credit: User:Calerusnak/CC BY-SA 3.0

Tech behemoth Apple announced on Thursday that it will begin scanning iPhones in the United States for images of child sexual abuse.

The landmark decision by the computer giant drew plaudits from child protection groups, but it also revived ongoing concerns over privacy and the power of governments to surveil their own citizens.

The new tool in the fight against the online sharing of child sexual abuse images is called “neuralMatch”; the scanning software detects known abuse images on a device before they can even be uploaded to iCloud.

If the software detects such an image, the picture will then be reviewed by a person. If it is confirmed to depict child sexual abuse, the user’s account will be disabled.

There will also be an immediate referral to the National Center for Missing and Exploited Children.
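In broad strokes, the reported flow could look like the sketch below. All of the names here (handle_match, refer_to_ncmec, and so on) are hypothetical illustrations of the process the article describes, not Apple’s actual implementation.

```python
# Hypothetical sketch of the reported flow: an on-device match is queued
# for human review, and only a confirmed match disables the account and
# triggers a referral to the National Center for Missing and Exploited
# Children (NCMEC). All names are illustrative.
from dataclasses import dataclass

@dataclass
class MatchReport:
    user_id: str
    image_id: str

def human_review_confirms(report: MatchReport) -> bool:
    """A person inspects the flagged image before any action is taken."""
    print(f"queued {report.image_id} for human review")
    return False  # placeholder decision

def disable_account(user_id: str) -> None:
    print(f"account {user_id} disabled")

def refer_to_ncmec(report: MatchReport) -> None:
    print(f"referral filed for {report.image_id}")

def handle_match(report: MatchReport) -> None:
    if human_review_confirms(report):
        disable_account(report.user_id)
        refer_to_ncmec(report)

handle_match(MatchReport(user_id="u123", image_id="img456"))
```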

Child sexual abuse problem causes Apple to walk fine line between privacy, safety

Apple officials also announced that they will scan users’ encrypted messages for sexually explicit content as another way to ensure children’s safety; this aspect of the new procedures raised the eyebrows of privacy advocates.

However, Apple stated that its new detection system will only flag images that are already in the center’s database of known child sexual abuse pictures.

Parents who take and share innocent photos of their child in a bathtub, therefore, appear to be in the clear. Still, some researchers say that the image matching tool, which doesn’t even “see” the images, just the mathematical “fingerprints” that represent them, could be used for sinister ends.
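To make the “fingerprint” idea concrete, here is a minimal sketch using a simple average hash. Apple’s neuralMatch and Microsoft’s PhotoDNA use far more robust perceptual hashes; the file names and the distance threshold below are illustrative assumptions.

```python
# Minimal illustration of fingerprint matching with a simple average
# hash (aHash). The system compares compact bit patterns, never the
# photos themselves; real systems use much more robust hashes.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to an 8x8 grayscale grid; each bit records whether a pixel
    is brighter than the image's mean brightness."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits; a small distance means similar images."""
    return bin(a ^ b).count("1")

# Hypothetical usage: flag an upload only if its hash is close to one
# already in the known-image database (paths are placeholders).
known_hashes = [average_hash("known_image.jpg")]
candidate = average_hash("upload.jpg")
flagged = any(hamming_distance(candidate, h) <= 5 for h in known_hashes)
```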

Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography. That could fool Apple’s algorithm and alert law enforcement. “Researchers have been able to do this pretty easily,” he said of the ability to trick such systems.

For Green and other experts, facial recognition systems, already instituted in China and other nations to keep tabs on their citizens, form an ominous specter of government control and surveillance.

Scans like those used to detect child sexual abuse could also be put to use in government surveillance of dissidents or even just ordinary protesters.

“Their technology won’t say no”

“What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,’” Green asked in the interview. “Does Apple say no? I hope they say no, but their technology won’t say no.”

The “face capture” process of existing facial recognition systems transforms analog information (in this case, the human face) into digital information (numerical feature vectors) derived from the person’s facial features.
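As a rough illustration of that idea, the sketch below reduces an image to a fixed-length vector and compares two vectors by cosine similarity. The “embedding” here is a random projection standing in for a trained face-recognition network, purely for illustration.

```python
# Rough sketch of "face capture": an image becomes a numeric feature
# vector, and two faces are compared by the distance between vectors.
# The random projection below is a stand-in for a trained network.
import numpy as np

def embed_face(image: np.ndarray, dim: int = 128) -> np.ndarray:
    rng = np.random.default_rng(seed=0)            # fixed basis so results repeat
    basis = rng.standard_normal((dim, image.size))
    vec = basis @ image.flatten().astype(float)
    return vec / np.linalg.norm(vec)               # normalize to unit length

def same_person(a: np.ndarray, b: np.ndarray, threshold: float = 0.9) -> bool:
    """For unit vectors, cosine similarity is just the dot product."""
    return float(a @ b) >= threshold

# Toy usage with two nearly identical 32x32 "images".
img1 = np.full((32, 32), 0.5)
img2 = img1 + np.random.default_rng(1).normal(0, 0.01, (32, 32))
print(same_person(embed_face(img1), embed_face(img2)))  # True
```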

Big tech companies such as Microsoft, Google and Facebook have already been sharing digital fingerprints of known images of child sexual abuse.

And Apple has already used those fingerprints to scan for such material among the files users store in its iCloud service, which is not as securely encrypted as the data held on its devices.
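Conceptually, scanning stored files against a shared fingerprint list can be as simple as a set lookup. The sketch below uses exact SHA-256 digests for brevity; real systems such as PhotoDNA use perceptual hashes so that re-encoded or resized copies still match.

```python
# Conceptual sketch of scanning a file store against a shared set of
# fingerprints. Exact SHA-256 digests are used for brevity; perceptual
# hashes (PhotoDNA-style) also catch re-encoded or resized copies.
import hashlib
from pathlib import Path

def digest(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_store(root: str, known_digests: set[str]) -> list[Path]:
    """Return every stored file whose digest appears in the shared set."""
    return [p for p in Path(root).rglob("*")
            if p.is_file() and digest(p) in known_digests]

# The digest set would come from a clearinghouse; empty here, so nothing matches.
print(scan_store(".", set()))
```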

Meanwhile, the US government has been pressuring the tech behemoth to allow increased surveillance of encrypted data.

Apple has had to be careful to keep its stated commitment to user privacy while attempting to help in the ongoing campaign to wipe out the online sexual abuse and exploitation of children.

Some media watchdogs are not impressed with the outcome. The Electronic Frontier Foundation, a pioneer in the field of online civil liberties, stated that Apple’s compromise in this area was “a shocking about-face for users who have relied on the company’s leadership in privacy and security.”

But others believe the development was necessary as a way to battle the scourge of child sexual abuse imagery, while acknowledging that abuse of the system remains a possibility.

Cal Berkeley researcher backs Apple’s new technology

Hany Farid, the computer scientist who invented PhotoDNA, the very technology used by law enforcement authorities to identify child sexual abuse images online, was hopeful, while admitting that abuse of the system could take place.

“Is it possible? Of course. But is it something that I’m concerned about? No,” said the University of California at Berkeley researcher.

Farid maintains that other programs that have been designed to make devices secure from various threats haven’t experienced “this type of mission creep.”

The popular WhatsApp messaging service not only provides its users with end-to-end encryption to protect their privacy, but also uses a system for detecting malware and warns users not to click on suspicious links.

In fact, Apple was one of the first major companies to adopt such “end-to-end” encryption, which scrambles messages so that they can be read only by their senders and recipients.
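As an illustration of that idea, here is a minimal sketch of end-to-end encryption using the open-source PyNaCl library; this shows only the general principle that intermediaries see ciphertext, not Apple’s actual iMessage protocol.

```python
# Minimal end-to-end encryption sketch with PyNaCl (pip install pynacl).
# Only the sender's and recipient's keys can open the message; a server
# relaying `ciphertext` sees only scrambled bytes.
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

sending_box = Box(alice_key, bob_key.public_key)    # Alice -> Bob
ciphertext = sending_box.encrypt(b"meet at noon")

receiving_box = Box(bob_key, alice_key.public_key)  # Bob decrypts
print(receiving_box.decrypt(ciphertext))            # b'meet at noon'
```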

Law enforcement officials have for years pressured Apple for access to that very information as part of their ongoing investigations into crimes, including child sexual abuse.

The Center for Democracy and Technology (CDT), based in Washington, DC, flatly called on Apple to abandon its plan to implement the new software, which it said destroys the company’s stated guarantee of “end-to-end encryption” to its users. Scanning messages for sexually explicit content, it maintains, in effect negates this security.

The CDT also brought up the potential difficulty of differentiating between truly deviant, criminal content and nudity in art or a humorous meme.

Apple insists end-to-end encryption will be maintained

Apple, for its part, says that its carefully crafted software will not create a backdoor that negates encryption. On the contrary, it says the technology actually enhances privacy.

The new technology will be part of the updates for iPhones, Macs and Apple Watches that will be going out later in 2021.

Some are applauding the new technology without reservation, including John Clark, the president and CEO of the National Center for Missing and Exploited Children. He said in a statement: “Apple’s expanded protection for children is a game changer. With so many people using Apple products, these new safety measures have lifesaving potential for children.”

Officials at another nonprofit that focuses on protecting children, Thorn, founded by Demi Moore and Ashton Kutcher, believe that Apple did indeed balance the need for privacy with the urgent need to protect the nation’s children.

Julia Cordua, the CEO of Thorn, stated that Apple’s technology balances “the need for privacy with digital safety for children.”

Apple also shared that its messaging app will soon use on-device machine learning in an effort to identify and blur sexually explicit photos on phones belonging to children. It will also warn parents of younger children of such images by way of a text message.
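A hedged sketch of how such an on-device step might look follows; the classifier here is a hypothetical placeholder, since Apple’s model is not public.

```python
# Hypothetical sketch of the on-device flow: a local classifier scores a
# photo, and a flagged image is blurred before display. `looks_explicit`
# stands in for Apple's unpublished on-device model.
from PIL import Image, ImageFilter

def looks_explicit(img: Image.Image) -> bool:
    """Placeholder for an on-device ML classifier."""
    return False  # a real model would return a learned decision

def prepare_for_display(path: str) -> Image.Image:
    img = Image.open(path)
    if looks_explicit(img):
        return img.filter(ImageFilter.GaussianBlur(radius=25))  # heavy blur
    return img
```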

In addition, the new software will “intervene” if users attempt to search for topics related to child sexual abuse.

However, to receive the warnings about sexually explicit images on their children’s devices, parents must enroll their child’s phone in the program. Users over the age of 13 can unenroll, which of course means that past this age, parents will not receive such notifications.
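That opt-in rule reduces to a simple policy check, sketched below with illustrative field names that are not Apple’s own.

```python
# Sketch of the reported opt-in rule: parents must enroll a child's
# device, and users over 13 can unenroll, which stops the parental
# notifications. Field names are illustrative.
from dataclasses import dataclass

@dataclass
class ChildDevice:
    age: int
    enrolled_by_parent: bool
    opted_out: bool = False

def parent_gets_notified(device: ChildDevice) -> bool:
    if not device.enrolled_by_parent:
        return False                 # no enrollment, no notifications
    if device.age > 13 and device.opted_out:
        return False                 # older children can withdraw
    return True

print(parent_gets_notified(ChildDevice(age=12, enrolled_by_parent=True)))                  # True
print(parent_gets_notified(ChildDevice(age=15, enrolled_by_parent=True, opted_out=True)))  # False
```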

Apple said in its announcement that neither of these latter features would compromise the security of private communications on its devices, and that it would not notify the police when these situations occur.
