
Apple delays child abuse photo scanning planned for iOS 15

Apple announced a new Child Safety policy to automatically scan user photos for child sexual assault material (CSAM) last month, spurring an outcry from privacy advocates and consumers about privacy rights violations and potential government exploitation. Now Apple is delaying the rollout of the tech to solicit feedback ‘over the coming months’ before its full release.

Apple previously planned to include its CSAM-scanning tech and an accompanying optional policy to screen sexual content in iMessages for youth in iOS 15 and iPadOS 15, which are expected to launch alongside the iPhone 13 (rumored to be unveiled on September 14). It would have gone live in the US, with no stated plans for a global rollout. Here’s Apple’s full statement on the delay, per TechCrunch:

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Shortly after introducing the new policies in early August via a blog post, Apple followed up with a multi-page FAQ giving detailed explanations about how both the CSAM scanning and youth iMessage screening would work. 

Apple planned to use its so-called NeuralHash tech to automatically scan photos and check whether they matched hashes of known CSAM. The tech only scanned images as they were being uploaded to iCloud (which is encrypted).

But the potential for governments to harness the automatic photo-scanning policy for their own uses had alarmed privacy advocates and industry groups – the Electronic Frontier Foundation (EFF) criticized the company for building any kind of ‘backdoor’ into user data, while the Center for Democracy and Technology (CDT) amassed a coalition decrying how such photo scanning could be abused by governments searching for objectionable material.

The CDT also laid out how another policy Apple planned to roll out alongside CSAM photo scanning – an optional feature in iMessage that blurs images with sexual content sent to users under 13 years old and notifies parents linked to the same family account – could “threaten the safety and wellbeing of some young people, and LGBTQ+ youths with unsympathetic parents are particularly at risk.”

Finally, Apple was also going to enable Siri and Search to give more helpful resources to users asking to report CSAM, as well as to intervene with warnings and supportive resources when users search for CSAM-related material. It’s unclear whether this will also be delayed.


Analysis: a step back for Apple, a step forward for privacy

The groups and individuals objecting to Apple’s new policy have criticized the tech giant’s methods, not its intent. In addition to opposing how it would violate user privacy and open a backdoor for government exploitation, they critiqued the potential for false positives with the CSAM scanning itself.

For instance, Apple outlined that its employees wouldn’t see any automatically scanned images uploaded to iCloud unless they passed a CSAM hash threshold – in other words, unless an image’s hash (a digital fingerprint of letters and numbers) found a match in a database of known CSAM.
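As an illustrative sketch only – not Apple’s actual NeuralHash pipeline, and with hypothetical hash values and a hypothetical threshold – the basic idea of threshold-based hash matching can be modeled as counting how many of a user’s image hashes appear in a database of known hashes, and only flagging an account for human review once that count crosses the threshold:

```python
# Illustrative sketch only -- not Apple's NeuralHash or its real protocol.
# The hash strings and the threshold value below are hypothetical stand-ins.

KNOWN_BAD_HASHES = {"a1f3", "9bc2", "77de"}  # hypothetical hash database
THRESHOLD = 2  # hypothetical: flag only once matches exceed this count

def count_matches(upload_hashes, database):
    """Count how many uploaded image hashes appear in the database."""
    return sum(1 for h in upload_hashes if h in database)

def should_flag(upload_hashes, database, threshold):
    """Flag for human review only once matches exceed the threshold."""
    return count_matches(upload_hashes, database) > threshold

uploads = ["a1f3", "0042", "9bc2", "ffff"]
print(count_matches(uploads, KNOWN_BAD_HASHES))              # 2
print(should_flag(uploads, KNOWN_BAD_HASHES, THRESHOLD))     # False
```

The threshold is the key design choice here: no single match triggers review, which is meant to limit the impact of an occasional false positive.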

While hash matching is a method used by, for instance, Microsoft for its PhotoDNA tech, website security company Cloudflare, and anti-child sex trafficking nonprofit Thorn, security researchers reportedly replicated Apple’s NeuralHash code and generated a ‘hash collision’ in which two visibly different images produced the same hash, according to TechCrunch.
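The concept of a collision can be shown with a deliberately weak toy hash (nothing like NeuralHash itself): two different inputs that map to the same digest, which is exactly why a hash match alone cannot prove two images are the same.

```python
# Toy demonstration of a hash collision using a deliberately weak hash.
# This is NOT NeuralHash; it only illustrates why collisions matter:
# a matching hash does not guarantee the underlying inputs are identical.

def weak_hash(data: bytes) -> int:
    """Sum of byte values modulo 256 -- trivially collision-prone by design."""
    return sum(data) % 256

input_a = b"ab"
input_b = b"ba"  # different bytes, same byte sum

print(weak_hash(input_a) == weak_hash(input_b))  # True: a collision
print(input_a == input_b)                        # False: the inputs differ
```

Real perceptual hashes are far harder to collide than this toy, but the researchers’ reported NeuralHash collisions show the risk is not purely theoretical.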

While we won’t know the true efficacy of Apple’s Child Safety protocols until they debut, it seems like Apple is taking the criticism and concerns seriously enough to take some months to refine its approach, meaning we may not see it roll out until the end of 2021 or 2022.



from TechRadar - All the latest technology news https://ift.tt/3DV6cSj
via IFTTT
