Apple SCANNING your iPhone to check for CHILD ABUSE – THIS IS NOT PRIVACY!



Written by Louis Rossmann


  1. All these rules and laws start with an innocent name and not-so-innocent intentions. "Child abuse protection," aka we're going to start monitoring your pictures. Next will be a verbal assault or unwanted sexting rule, aka we gotta monitor texts. Then an "unlawful crimes over the phone" rule that allows phone calls to be monitored.

  2. I have spent 25 years dealing with the aftermath of children being sexually abused… we have even fostered kids who have had to deal with this. I want to stop it more than anyone; yes, it happened to me too, so for 50 years I have dealt with this. So yes, I want it stopped. But not this way… this way is gonna end horribly wrong…

  3. Apple is rather inconsistent in their supposed convictions about privacy. When there was a shooter in San Bernardino, they wouldn't help the FBI, citing privacy reasons. Yet now they have no privacy concerns about a private company, not a law enforcement agency, looking through everyone's photos?

  4. Jesus. Could you imagine working on the "training" of the system for THAT? Either that has to be the most disgusting job in tech, or the system was left to its own devices to find the images (via categorization) and then train itself, ultimately leading to a welter of false positives (how many iPhone fans with pictures of their little kids are going to be getting calls from the police, or worse, SWATted?). This strikes me as giving in to the Q-Anon imbeciles who see child porn everywhere except where it really exists (in their own ranks much of the time, which is obviously why they're deflecting so hard). The Cambridge security expert is right on in this instance.

  5. I can already think of plenty of situations where an algorithm would mess this up.
    If you were recording an interaction with your child and, let's say, they, I don't know… scrape their knee and come over to you crying, I would think an AI would flag that as abuse because it doesn't have any context.

  6. I believe that the hashing they are referring to is Locality Sensitive Hashing (LSH). It is a technique that has been used for finding images in very large image collections, and it doesn't necessarily require an exact hash match. On top of the LSH there has to be another algorithm to 'compare' the results of the hashing. I'm going to guess that this algorithm is a neural network model, given the 'neuralMatch' name.

    Not that this information changes anything.
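    To make the LSH idea concrete, here is a minimal sketch of one common LSH scheme, random-hyperplane hashing (SimHash), applied to toy feature vectors standing in for image features. This is an illustration of the general technique the comment describes, not Apple's actual system: the real pipeline reportedly derives the feature vector with a neural network, and the vectors, dimensions, and thresholds below are made up for the example. The key property is that a slightly altered input (e.g. a re-compressed image) yields a hash within a small Hamming distance of the original, while unrelated inputs land far apart.

    ```python
    import random

    def simhash(vec, planes):
        # One bit per random hyperplane: 1 if the vector lies on the
        # positive side (dot product >= 0), else 0.
        bits = 0
        for plane in planes:
            dot = sum(v * w for v, w in zip(vec, plane))
            bits = (bits << 1) | (1 if dot >= 0 else 0)
        return bits

    def hamming(a, b):
        # Number of differing bits between two hashes.
        return bin(a ^ b).count("1")

    random.seed(42)
    DIM, NBITS = 8, 16  # toy sizes, chosen only for the demo
    planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(NBITS)]

    original = [0.9, 0.1, 0.4, 0.7, 0.2, 0.8, 0.5, 0.3]
    slightly_edited = [v + 0.02 for v in original]          # near-duplicate
    unrelated = [random.gauss(0, 1) for _ in range(DIM)]    # different "image"

    h1 = simhash(original, planes)
    h2 = simhash(slightly_edited, planes)
    h3 = simhash(unrelated, planes)

    # A near-duplicate stays close in Hamming distance; an unrelated
    # vector does not, so a distance threshold can flag "matches".
    print(hamming(h1, h2), hamming(h1, h3))
    ```

    A second comparison stage (the guessed neural-network part) would then decide whether a small Hamming distance counts as a match, which is exactly where false positives like the ones comment 5 worries about would have to be handled.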

  7. lol they are probably already doing something like this… But how would the darn AI even know what to look for? All very weird, all very 1984…

  8. Just like Australia banning mature-rated games, or the UK confiscating even butter knives, this world, or the assholes that want to control it, tries to create nanny states so that its citizens won't be subject to anything unpleasant.

  9. I’m probably the guy Louis talked about near the end. – personally digital privacy has never meant a whole lot to me – perhaps because I’m in my 20s and grew up with this stuff. My main concern for privacy is from ppl I already know – was never too worried about a guy who I don’t know and will never meet who’s job it is to sort through this stuff. I respect yalls desire to protect ur own digital privacy but personally don’t really get it.