
WhatsApp won't use Apple's child abuse image scanner

Source: Feature Flash | Editor: synthesize | Time: 2025-07-03 04:02:17

Just because Apple has a plan (and a forthcoming security feature) designed to combat the spread of child sex abuse images, that doesn't mean everyone's getting on board.

WhatsApp boss Will Cathcart joined the chorus of Apple critics on Friday, stating in no uncertain terms that the Facebook-owned messaging app won't be adopting this new feature once it launches. Cathcart then went on to lay out his concerns about the machine learning-driven system in a sprawling thread.

"This is an Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control," Cathcart wrote midway through the thread. "Countries where iPhones are sold will have different definitions on what is acceptable."


While WhatsApp's position on the feature itself is clear enough, Cathcart's thread focuses mostly on raising hypothetical scenarios that suggest where things could go wrong with it. He wants to know if and how the system will be used in China, "what will happen when" spyware companies exploit it, and how error-proof it really is.

The thread amounts to an emotional appeal. It isn't terribly helpful for those who might be seeking information on why Apple's announcement raised eyebrows. Cathcart parrots some of the top-level talking points raised by critics, but the approach is more provocative than informative.


As Mashable reported on Thursday, one piece of the forthcoming security update uses a proprietary technology called NeuralHash, which computes a hash (a signature, basically) for each image file and checks it against the hashes of known Child Sex Abuse Materials (CSAM). All of this happens before a photo gets stored in iCloud Photos, and Apple isn't allowed to do or look at a thing unless the hash check sets off alarms.
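To make the hash-check idea concrete, here is a minimal sketch of that kind of matching pipeline. Note the caveats: `average_hash` and `matches_known_database` are hypothetical toy functions, not Apple's actual NeuralHash (which is a proprietary neural-network-based perceptual hash), and a real system compares against a database of hashes it never lets the client inspect directly.

```python
# Toy illustration of hash-based image matching. This is NOT Apple's
# NeuralHash; it's a simple "average hash" on an 8x8 grayscale image,
# used only to show the shape of the technique.

def average_hash(pixels):
    """pixels: 8x8 list of grayscale values (0-255). Returns a 64-bit int
    where each bit says whether that pixel is at or above the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def matches_known_database(pixels, known_hashes, max_hamming=0):
    """Return True if the image's hash is within max_hamming bits of any
    hash in known_hashes (a set of 64-bit ints from a reference database)."""
    h = average_hash(pixels)
    return any(bin(h ^ k).count("1") <= max_hamming for k in known_hashes)
```

The key property a perceptual hash aims for, unlike a cryptographic one, is that visually similar images land near each other in hash space; hence the Hamming-distance tolerance rather than an exact-equality check.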

The hash check approach is fallible, of course. It's not going to catch CSAM that isn't catalogued in a database, for one. Matthew Green, a cybersecurity expert and professor at Johns Hopkins University, also pointed to the possible risk of someone weaponizing a CSAM file hash inside a non-CSAM image file.
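The collision risk Green describes can be illustrated with a deliberately weak checksum. The `toy_hash` below is hypothetical and far simpler than any real perceptual hash; attacks on the real systems involve crafted adversarial perturbations rather than trivial byte reordering, but the principle is the same: two different inputs producing the same hash value would both trip a naive hash-only check.

```python
# Toy demonstration of a hash collision. A deliberately weak 8-bit
# checksum: it sums the bytes, so it is order-insensitive and collides
# trivially. Real perceptual-hash collisions require more effort, but
# the failure mode is the same.

def toy_hash(data: bytes) -> int:
    """Sum of bytes mod 256; any permutation of the same bytes collides."""
    return sum(data) % 256

flagged_hash = toy_hash(b"ab")   # pretend this hash sits in the blocklist
crafted_hash = toy_hash(b"ba")   # a different, innocuous input...
# ...that produces the same hash, so a system matching on hash alone
# would flag the innocent file too.
```

This is why the design of such systems typically includes thresholds (multiple matches before review) and human verification before any report is made.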

There's another piece to the security update as well. In addition to NeuralHash-powered hash checks, Apple will also introduce a parental control feature that scans images sent via iMessage to child accounts (meaning accounts that belong to minors, as designated by the account owners) for sexually explicit materials. Parents and guardians that activate the feature will be notified when Apple's content alarm trips.


The Electronic Frontier Foundation (EFF) released a statement critical of the forthcoming update shortly after Apple's announcement. It's an evidence-supported takedown of the plan that offers a much clearer sense of the issues Cathcart gestures at vaguely in his thread.

There's a reasonable discussion to be had about the merits and risks of Apple's plan. Further, WhatsApp is perfectly within its rights to raise objections and commit to not making use of the feature. But you, a user who might just want to better understand this thing before you form an opinion, have better options for digging up the info you want than a Facebook executive's Twitter thread.

Start with Apple's own explanation of what's coming. The EFF response is a great place to turn next, along with some of the supporting links shared in that write-up. It's not that voices like Cathcart and even Green have nothing to add to the conversation; it's more that you're going to get a fuller picture if you look beyond the 280-character limits of Twitter.

Topics Apple Cybersecurity Privacy Social Media WhatsApp


Copyright © 2025 Feature Flash
