Wednesday, September 15, 2021

ANALYSIS/OPINION:

On Saturday, September 4th, the Financial Times headlined: “Apple delays child sex abuse detection code.” Monstrous is just one adjective that comes to mind.

If Apple executives and engineering specialists have the technology and capability to install protective software that shields millions of at-risk children, then the decision to “bow to pressure for a planned launch of detection software of photos of child pornography and sex abuse on iPhones” should be investigated immediately by the Department of Justice.


What an imprudent and dangerous decision it was by Apple executives. And what could possibly have delayed Apple’s decision unless the company stood to lose billions of dollars in sales and feared the wrath of special interest groups such as the American Civil Liberties Union?

Apple should not be allowed to delay this urgently needed technology. In fact, there should be a DOJ investigation into Apple’s sudden reversal. Should Apple’s senior vice president of software engineering, Craig Federighi, lose the case, the recommendation should be life imprisonment without parole and a multimillion-dollar fine.

Moreover, all the Apple executives who conspired, or who were complacent and complicit, in preventing the new technology from being implemented in all Apple products and software have facilitated the exploitation of millions of at-risk children worldwide. They should be arrested and tried under the Trafficking Victims Protection Act (TVPA).

Any technology that can protect millions of at-risk children from child sex trafficking, imagery, and exploitation should be MANDATORY on all Apple products and software, just as wearing a mask is suddenly mandatory regardless of a person’s COVID-19 vaccination status and a clean bill of health.

To help put this issue in its proper perspective, it is valuable to know the facts. According to Statista, Apple reported a net income of 21.74 billion US dollars in the third quarter of the fiscal year 2021. More than 25 million child pornography images and videos were reviewed by the National Center for Missing and Exploited Children (NCMEC) in 2020. That is more than 480,769 images of children being exploited and criminally abused every week. How many of those images were transmitted by Apple iPhones and other Apple products? NCMEC is a “private, nonprofit organization established in 1984 under a congressional mandate that serves as a clearinghouse for reports of child abuse.”

If these numbers don’t sufficiently scare you, then read this: children under 12 years of age make up 78% of the images and videos analyzed by Cybertip.ca, and the Canadian Centre for Child Protection found that “63.40% of sexually exploited children online are under the age of 8 years.” The same investigation by the Canadian Centre discovered that “80.42% of exploited children were girls and 19.58% were boys.”

Thorn.org, another important anti-trafficking organization, found that criminals who exploit and share child pornography images and traffic children use online software and technology. They “share their content via Internet networks, forums, and different forms of internet technology, including websites, email, instant messaging/ICQ, Internet Relay Chat (IRC), newsgroups, bulletin boards, peer-to-peer networks, internet gaming sites, social networking sites, and anonymized networks.”

On August 5, 2021, The Wall Street Journal first reported that Apple planned to “introduce new iPhone software designed to identify and report collections of sexually exploitative images of children… It will use new techniques in cryptography and artificial intelligence to identify child sexual abuse material when it is stored using iCloud Photos.” The operative statement: “Apple will detect whether images on the device match a known database of these illegal images. If a certain number of them (the number is unknown) are uploaded to iCloud Photos, Apple will review the images. If they are found to be illegal, Apple will report them to the National Center for Missing and Exploited Children.”
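In plain terms, the reported design matches images against a database of known illegal material and triggers human review only after a threshold number of matches. The following is a minimal, purely illustrative sketch of that threshold logic; it uses an ordinary cryptographic hash and invented placeholder data, whereas Apple’s actual system reportedly uses a perceptual “NeuralHash” and cryptographic blinding, neither of which is reproduced here, and the real review threshold has not been disclosed.

```python
import hashlib

# Invented placeholder database of known-image hashes (NOT Apple's real,
# encrypted NeuralHash database; these stand in for known illegal images).
KNOWN_HASH_DB = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# Hypothetical review threshold; Apple has not disclosed the actual number.
MATCH_THRESHOLD = 2


def count_matches(uploaded_images: list) -> int:
    """Count how many uploaded images (raw bytes) match the known-hash DB."""
    return sum(
        hashlib.sha256(img).hexdigest() in KNOWN_HASH_DB
        for img in uploaded_images
    )


def should_flag_for_review(uploaded_images: list) -> bool:
    """Flag for human review only once matches reach the threshold,
    mirroring the threshold behavior the Journal described."""
    return count_matches(uploaded_images) >= MATCH_THRESHOLD
```

The key design point the sketch illustrates is that a single stray match does not trigger review; only an accumulation of matches against known material does.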

So, what is so dangerous about that? Why are human rights groups and ACLU partners afraid of the Apple software? Are Apple executives afraid of their new technology? Will they be the predators getting caught? And why would the ACLU and any worthy human rights organization discourage protecting a child from a heinous sexually violent crime? 

Has steel in the spine completely absented itself from American corporate leaders?

• Conchita Sarnoff is the Executive Director of the Alliance to Rescue Victims of Trafficking.


Copyright © 2021 The Washington Times, LLC.