{"id":7136,"date":"2021-08-12T13:36:50","date_gmt":"2021-08-12T13:36:50","guid":{"rendered":"http:\/\/TheNextWeb=1363823"},"modified":"2021-08-12T13:36:50","modified_gmt":"2021-08-12T13:36:50","slug":"an-examination-of-apples-plans-to-scan-your-iphone-photos-for-abusive-content","status":"publish","type":"post","link":"https:\/\/www.londonchiropracter.com\/?p=7136","title":{"rendered":"An examination of Apple\u2019s plans to \u2018scan\u2019 your iPhone photos for abusive content"},"content":{"rendered":"\n<div><img decoding=\"async\" src=\"https:\/\/img-cdn.tnwcdn.com\/image\/tnw?filter_last=1&amp;fit=1280%2C640&amp;url=https%3A%2F%2Fcdn0.tnwcdn.com%2Fwp-content%2Fblogs.dir%2F1%2Ffiles%2F2021%2F08%2Fphone-1537387_1280.jpg&amp;signature=2148615cd41eb1d8585e242ae1fcfe84\" class=\"ff-og-image-inserted\"><\/div>\n<p>The proliferation of <a href=\"https:\/\/www.nytimes.com\/interactive\/2019\/09\/28\/us\/child-sex-abuse.html\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">child sexual abuse material<\/a> on the internet is harrowing and sobering. Technology companies send <a href=\"https:\/\/www.missingkids.org\/content\/dam\/missingkids\/gethelp\/2020-reports-by-esp.pdf\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">tens of millions of reports per year<\/a> of these images to the nonprofit <a href=\"https:\/\/www.missingkids.org\/theissues\/csam\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">National Center for Missing and Exploited Children<\/a>.<\/p>\n<p>The way companies that provide cloud storage for your images usually detect child abuse material leaves you vulnerable to privacy violations by the companies \u2013 and hackers who break into their computers. On Aug. 
5, 2021, Apple <a href=\"https:\/\/www.apple.com\/child-safety\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">announced a new way to detect this material<\/a> that promises to better protect your privacy.<\/p>\n<p>As a <a href=\"https:\/\/scholar.google.com\/citations?user=lneZSfIAAAAJ\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">computer scientist<\/a> who studies cryptography, I can explain how Apple\u2019s system works, why it\u2019s an improvement, and why Apple needs to do more.<\/p>\n<h2>Who holds the key?<\/h2>\n<p>Digital files can be protected in a sort of virtual lockbox via encryption, which garbles a file so that it can be revealed, or decrypted, only by someone holding a secret key. Encryption is one of the best tools for protecting personal information as it traverses the internet.<\/p>\n<p>Can a cloud service provider detect child abuse material if the photos are garbled using encryption? It depends on who holds the secret key.<\/p>\n<p>Many cloud providers, including Apple, keep a copy of the secret key so they can assist you in <a href=\"https:\/\/support.apple.com\/en-us\/HT201487\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">data recovery<\/a> if you forget your password. With the key, <a href=\"https:\/\/www.macobserver.com\/analysis\/apple-scans-uploaded-content\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">the provider can also match<\/a> photos stored on the cloud against known child abuse images held by the National Center for Missing and Exploited Children.<\/p>\n<p>But this convenience comes at a big cost. 
A cloud provider that stores secret keys might <a href=\"https:\/\/www.vice.com\/en\/article\/g5gk73\/google-fired-dozens-for-data-misuse\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">abuse its access<\/a> <a href=\"https:\/\/www.telegraph.co.uk\/news\/2021\/07\/12\/exclusive-extract-facebooks-engineers-spied-women\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">to your data<\/a> or fall prey to a <a href=\"https:\/\/epic.org\/privacy\/data-breach\/equifax\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">data breach<\/a>.<\/p>\n<p>A better approach to online safety is <a href=\"https:\/\/ssd.eff.org\/en\/glossary\/end-end-encryption\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">end-to-end encryption<\/a>, in which the secret key is stored only on your own computer, phone, or tablet. In this case, the provider cannot decrypt your photos. Apple\u2019s answer to checking for child abuse material that\u2019s protected by end-to-end encryption is a new procedure in which the cloud service provider, meaning Apple, and your device perform the image matching together.<\/p>\n<h2>Spotting evidence without looking at it<\/h2>\n<p>Though that might sound like magic, with modern cryptography it\u2019s actually possible to work with data that you cannot see. 
I have contributed to projects that use cryptography to <a href=\"https:\/\/thebwwc.org\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">measure the gender wage gap<\/a> <a href=\"https:\/\/www.usenix.org\/system\/files\/soups2019-qin.pdf\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">without learning anyone\u2019s salary<\/a>&nbsp;and to <a href=\"https:\/\/www.mycallisto.org\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">detect repeat offenders of sexual assault<\/a> <a href=\"https:\/\/static1.squarespace.com\/static\/5ff5d891409193661a0718c0\/t\/604134db3f35b3501dabfa4a\/1614886107693\/callisto-cryptographic-approach.pdf\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">without reading any victim\u2019s report<\/a>. And there are <a href=\"https:\/\/drive.google.com\/file\/d\/1NT_vdxRC8YEPlkQa2KHw22ai9IshyU73\/view\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">many more examples<\/a> of companies and governments using cryptographically protected computing to provide services while safeguarding the underlying data.<\/p>\n<p><a href=\"https:\/\/www.apple.com\/child-safety\/pdf\/Apple_PSI_System_Security_Protocol_and_Analysis.pdf\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">Apple\u2019s proposed image matching<\/a> on iCloud Photos uses cryptographically protected computing to scan photos without seeing them. It\u2019s based on a tool called <a href=\"https:\/\/blog.openmined.org\/private-set-intersection\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">private set intersection<\/a> that has been studied by cryptographers since the 1980s. This tool allows two people to discover files that they have in common while hiding the rest.<\/p>\n<p>Here\u2019s how the image matching works. Apple distributes to everyone\u2019s iPhone, iPad, and Mac a database containing indecipherable encodings of known child abuse images. 
For each photo that you upload to iCloud, your device <a href=\"https:\/\/www.apple.com\/child-safety\/pdf\/Expanded_Protections_for_Children_Technology_Summary.pdf\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">computes a digital fingerprint<\/a>, called a NeuralHash. The fingerprint matching works even if someone makes small changes to a photo. Your device then creates a voucher for your photo that your device can\u2019t understand, but that tells the server whether the uploaded photo matches child abuse material in the database.<\/p>\n<p>If enough vouchers from a device indicate matches to known child abuse images, the server learns the secret keys to decrypt all of the matching photos \u2013 but not the keys for other photos. Otherwise, the server cannot view any of your photos.<\/p>\n<p>Having this matching procedure take place on your device can be better for your privacy than the previous methods, in which the matching takes place on a server \u2013 if it\u2019s deployed properly. But that\u2019s a big caveat.<\/p>\n<h2>Figuring out what could go wrong<\/h2>\n<p>There\u2019s a <a href=\"https:\/\/www.youtube.com\/watch?v=XLMDSjCzEx8\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">line in the movie \u201cApollo 13\u201d<\/a> in which Gene Kranz, played by Ed Harris, proclaims, \u201cI don\u2019t care what anything was designed to do. I care about what it can do!\u201d Apple\u2019s phone scanning technology is designed to protect privacy. Computer security and tech policy experts are trained to discover ways that technology can be used, misused, and abused, regardless of its creator\u2019s intent. 
However, Apple\u2019s announcement <a href=\"https:\/\/twitter.com\/mattblaze\/status\/1423474134202437637\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">lacks information to analyze essential components<\/a>, so it is not possible to evaluate the safety of its new system.<\/p>\n<p>Security researchers need to see Apple\u2019s code to validate that the device-assisted matching software is faithful to the design and doesn\u2019t introduce errors. Researchers also must test whether it\u2019s possible to fool Apple\u2019s NeuralHash algorithm into changing fingerprints by <a href=\"https:\/\/twitter.com\/yvesalexandre\/status\/1423293697152610314\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">making imperceptible changes to a photo<\/a>.<\/p>\n<p>It\u2019s also important for Apple to develop an auditing policy to hold the company accountable for matching only child abuse images. The threat of mission creep was a risk even with server-based matching. The good news is that on-device matching offers new opportunities to audit Apple\u2019s actions because the encoded database binds Apple to a specific image set. Apple should allow everyone to check that they\u2019ve received the same encoded database and allow third-party auditors to validate the images contained in this set. These public accountability goals <a href=\"https:\/\/www.bu.edu\/riscs\/2021\/08\/10\/apple-csam\/\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">can be achieved using cryptography<\/a>.<\/p>\n<p>Apple\u2019s proposed image-matching technology has the potential to improve digital privacy and child safety, especially if Apple follows this move by <a href=\"https:\/\/www.reuters.com\/article\/us-apple-fbi-icloud-exclusive\/exclusive-apple-dropped-plan-for-encrypting-backups-after-fbi-complained-sources-idUSKBN1ZK1CT\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">giving iCloud end-to-end encryption<\/a>. 
But no technology on its own can fully answer complex social problems. All options for how to use encryption and image scanning have <a href=\"https:\/\/mobile.twitter.com\/alexstamos\/status\/1424054544556646407\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">delicate, nuanced effects<\/a> on society.<\/p>\n<p>These delicate questions require time and space to reason through potential consequences of even well-intentioned actions before deploying them, through <a href=\"https:\/\/cyber.fsi.stanford.edu\/io\/content\/e2ee-workshops\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">dialogue<\/a> with affected groups and researchers with a wide variety of backgrounds. I urge Apple to join this dialogue so that the research community can collectively improve the safety and accountability of this new technology.<\/p>\n<p><em>Article by <a href=\"https:\/\/theconversation.com\/profiles\/mayank-varia-503584\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">Mayank Varia<\/a>, Research Associate Professor of Computer Science, <a href=\"https:\/\/theconversation.com\/institutions\/boston-university-898\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">Boston University<\/a><\/em><\/p>\n<p><em>This article is republished from <a href=\"https:\/\/theconversation.com\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">The Conversation<\/a> under a Creative Commons license. 
Read the <a href=\"https:\/\/theconversation.com\/apple-can-scan-your-photos-for-child-abuse-and-still-protect-your-privacy-if-the-company-keeps-its-promises-165785\" target=\"_blank\" rel=\"nofollow noopener noreferrer\">original article<\/a>.<\/em><\/p>\n<p> <a href=\"https:\/\/thenextweb.com\/news\/examination-apples-plans-to-scan-your-iphone-photos-abusive-content-syndication\">Source<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>The proliferation of child sexual abuse material on the internet is harrowing and sobering. Technology companies send tens of millions of reports per year of these images to the nonprofit National Center&#8230;<\/p>\n","protected":false},"author":1,"featured_media":7137,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[],"_links":{"self":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/7136"}],"collection":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=7136"}],"version-history":[{"count":0,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/posts\/7136\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=\/wp\/v2\/media\/7137"}],"wp:attachment":[{"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=7136"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=7136"},{"taxonomy":"post_tag","embeddable":true,"hr
ef":"https:\/\/www.londonchiropracter.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=7136"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}