How to Stop Apple From Scanning Photos? – Guide
Cybersecurity and privacy experts had mixed reactions to the announcement, and so did users. However, many of the arguments clearly ignore how widespread the practice of scanning image libraries for CSAM already is. They also overlook the fact that Apple isn’t abandoning its privacy protection technologies.
Experts explain Apple’s controversial new plan to scan your iPhone photos — and what it could mean for your privacy.
Apple has been touting its high-level user privacy standards for years, but its new plan to scan iPhone photos for child sexual abuse material (CSAM) is raising alarms in the tech world. While everyone agrees on the importance of cracking down on sexually explicit content involving children, privacy experts warn that Apple’s photo scanning has its downsides. “This new CSAM policy can lead to many unintended consequences,” said Karim Hijazi, CEO of cybersecurity firm Prevailion. “This is just one of the many ways our privacy is being eroded on a daily basis.”
What is Apple doing?
Apple will use its CSAM detection technology in three new features in the iOS 15 update, according to the company’s official statement. The first is a parental control feature that, when activated, checks photos in the Messages app on children’s phones and sends a notification if it detects explicit content. The second checks, flags, and reports photos in iCloud storage that contain known child sexual abuse material, and the third notifies users when they use Siri or the search bar to search for child abuse imagery. Although the iPhone is considered one of the safest phones, the second feature on this list is of particular concern to privacy experts like Hijazi.
How will this new feature work?
Using an on-device hashing algorithm Apple calls NeuralHash, Apple’s CSAM detection technology will scan images in your iCloud account and compare them against hashes of known child sexual abuse images, which are stored in databases maintained by child safety organizations, including the National Center for Missing and Exploited Children (NCMEC). Upload 30 or more images that match entries in the child sexual abuse databases and a human reviewer will analyze each image to confirm whether it really is CSAM, according to Alex Hamerstone, director at security consulting firm TrustedSec. If it is determined to be CSAM, Apple will terminate your account and report you to NCMEC and legal authorities. Parents may fear the technology will flag them over innocent photos of their children, say, a silly child covered in foam at bath time. But Apple says the system isn’t just looking for pictures of kids; it’s looking for images that match known and validated child sexual abuse material in at least two child safety organizations’ databases.
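To make the matching step concrete, here is a minimal sketch of threshold-based hash matching, under a deliberately simplified model: each photo is reduced to a single 64-bit hash and compared against a set of known hashes. The names (`PhotoHash`, `knownHashes`, `evaluateUploads`) and the placeholder hash values are illustrative assumptions, not Apple’s actual NeuralHash system, which is not public.

```swift
import Foundation

// Illustrative stand-in for a perceptual hash. Apple's real system
// uses NeuralHash, which is not a public API; here a photo is simply
// represented by a precomputed 64-bit value.
typealias PhotoHash = UInt64

// Hypothetical database of known CSAM hashes. In reality these are
// supplied by NCMEC and other child safety organizations, and are
// blinded so the device cannot read them directly.
let knownHashes: Set<PhotoHash> = [0xDEADBEEF, 0xCAFEBABE]

// Apple's stated threshold: about 30 matches before human review.
let reviewThreshold = 30

/// Counts how many uploaded photo hashes match the known database
/// and reports whether the account crosses the review threshold.
func evaluateUploads(_ uploadedHashes: [PhotoHash]) -> (matches: Int, flaggedForReview: Bool) {
    let matches = uploadedHashes.filter { knownHashes.contains($0) }.count
    return (matches, matches >= reviewThreshold)
}

let result = evaluateUploads([0xDEADBEEF, 0x12345678])
print("Matches: \(result.matches), flagged: \(result.flaggedForReview)")
```

The key design point the sketch captures is that no single match triggers anything; only an account that accumulates matches past the threshold is escalated to human review.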
Why is Apple scanning photos?
Child sexual abuse material is spreading faster than ever, with millions of images and videos online and more than 19,100 victims identified by authorities. To help stem the spread of CSAM, Apple is increasing its efforts to detect and flag sexually explicit content involving children on its devices. In a statement, Apple said its new features aim “to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material”. But experts believe this new photo-scanning technology could become one of the top security threats facing smartphones.
Why are people worried?
Unfortunately, Apple’s new CSAM detection system could backfire, experts say. Many fear that Apple’s scanning of photos could lead to abuse by bad actors such as hackers, governments, and even Apple itself. “Apple has basically created a bypass to its end-to-end encryption that it – or anyone who authorizes it – can use to gain access to a person’s device without their permission and spy on its content,” says Hijazi. End-to-end encryption is a security feature that ensures no one can eavesdrop on your messages. But using Apple’s new CSAM technology, someone, hackers or rogue Apple employees, for example, could access and spy on users’ photos. The feature could also be used to search for other types of material on users’ devices, such as political content or material critical of a repressive government like China’s or Iran’s, according to Hamerstone. In response to these criticisms, Apple issued a statement reassuring users that it would “refuse such demands” from governments.
Are there other risks?
In addition to making users’ devices and information more vulnerable, Apple’s photo-scanning technology can mistake harmless photos for sexually explicit content, errors known as false positives. Apple claims the chance of a false-positive CSAM detection is one in a trillion. But Hijazi notes that researchers who recently tested a version of Apple’s CSAM scanning algorithm found errors in its ability to handle cropped and rotated images, as well as to recognize differences between two photos. This finding raises concerns that the likelihood of false positives is greater than Apple suggests, according to Hijazi. “I believe that Apple’s heart is in the right place and it is really trying to do the right thing here, but this measure is just too far-reaching because it sacrifices everyone’s privacy in the process,” he says. For your information, your cell phone can also get viruses, so follow these steps to protect your iPhone.
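The false-positive concern comes down to how perceptual hashes behave: unlike cryptographic hashes, they are designed so that visually similar images produce similar hash values, which means two unrelated images can occasionally land close together. Below is a minimal sketch, assuming a generic Hamming-distance comparison between 64-bit hashes (again, not Apple’s actual NeuralHash); the hash values and the `matchCutoff` are invented for illustration.

```swift
import Foundation

// Perceptual hashes are compared by Hamming distance: the number of
// differing bits. A small distance is treated as "probably the same image."
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// Hypothetical hashes: an original photo, a slightly edited copy
// (cropping or rotation shifts a few bits), and an unrelated photo
// whose hash happens to land just as close (a false positive).
let original: UInt64  = 0b1011_0110_1100_1010
let edited: UInt64    = 0b1011_0110_1100_1000 // 1 bit differs
let unrelated: UInt64 = 0b1011_0111_1100_1010 // also 1 bit differs

let matchCutoff = 2 // distances at or below this count as a match

for (label, hash) in [("edited", edited), ("unrelated", unrelated)] {
    let d = hammingDistance(original, hash)
    print("\(label): distance \(d), match = \(d <= matchCutoff)")
}
```

Both comparisons come out as matches here, which is the whole tension: a cutoff loose enough to catch edited copies of known images is also loose enough to occasionally flag an unrelated photo.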
Is there a way to cancel?
Don’t want Apple to scan photos on your device? Since Apple only scans photos uploaded to iCloud, you can opt out of the new feature by disabling iCloud storage for your photos. Follow these steps to disable iCloud Photos:
1. Open Settings and tap your name at the top of the screen.
2. Tap iCloud, then tap Photos.
3. Turn off iCloud Photos.
Alternatively, you can disable iCloud Photos by going to Settings > Photos and turning off the iCloud Photos toggle there. Instead of storing your iPhone photos on Apple’s iCloud, Hamerstone suggests keeping them on a home computer or a flash drive to protect your privacy. If you decide to use Apple’s iCloud despite the risks, you should know how secure iCloud really is before storing your data.
How else can I protect my privacy?
Apple’s CSAM detection feature isn’t the only security threat lurking on your iPhone, according to Hijazi. “If you’re concerned about your privacy, you should assume that any iOS device is vulnerable to end-to-end encryption bypasses and full access to photos,” he says. “There is no privacy online or on any connected device, and you must always assume that is the case and behave accordingly.” Hamerstone recommends protecting your accounts with strong passwords and multi-factor authentication, using privacy tools like encryption, and limiting which apps you download to your iPhone. Also check your phone for apps that can spy on you, and learn how to permanently delete apps on an iPhone.
Final note
I hope you liked this guide on How to Stop Apple From Scanning Photos. If you have any questions about this article, feel free to ask us. Also, please share your love by sharing this article with your friends.