U.S. tech giant Apple plans to roll out a remote update that will scan Americans’ iPhones for child sexual abuse images, Insider reported.
If found, Insider’s Heather Schlitz wrote, citing a Thursday report from the Financial Times, “Human reviewers would then alert law enforcement if they think the images are illegal.”
Security experts, however, have warned that such a move could “open the floodgates to extensive surveillance.”
Matthew Green, a cryptographer at Johns Hopkins University, warned that the technology could be abused.
“Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content,” Green said, according to the BBC. “Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone.”
According to the Associated Press, Green added, “What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for.’ Does Apple say no? I hope they say no, but their technology won’t say no.”
In a statement, the online civil liberties group Electronic Frontier Foundation said that Apple’s apparent pivot on privacy protections is a “shocking about-face for users who have relied on the company’s leadership in privacy and security.”
John Clark, president and CEO of the National Center for Missing and Exploited Children, told the Associated Press that the company’s plan is a “game changer” in protecting children.
“With so many people using Apple products, these new safety measures have a lifesaving potential for children,” Clark added.
Author: Sarah Taylor