Critics Say Apple Built a ‘Backdoor’ Into Your iPhone With Its New Child Abuse Detection Tools

Apple’s plans to roll out new features aimed at combating Child Sexual Abuse Material (CSAM) on its platforms have caused no small amount of controversy.

The company is essentially trying to pioneer a solution to a problem that, in recent years, has stymied law enforcement officials and technology companies alike: the large, ongoing crisis of CSAM proliferation on major internet platforms. As recently as 2018, tech firms reported the existence of as many as 45 million photos and videos that constituted child sex abuse material—a terrifyingly high number.

Yet while this crisis is very real, critics fear that Apple’s new features—which involve algorithmic scanning of users’ devices and messages—constitute a privacy violation and, more worryingly, could one day be repurposed to search for material other than CSAM. Such a shift could open the door to new forms of widespread surveillance and serve as a potential workaround for encrypted communications—one of privacy’s last, best hopes.

Read more at Gizmodo
