
I'd view a "grey blob" to be the MVP of hash collisions. I doubt that this will end with grey blobs - I see it ending with common images (memes would be great for this) being invisibly altered to collide with a CSAM hash.
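
To make that concrete, here's a toy sketch of the kind of alteration I mean, using a simple 8x8 "average hash" as a stand-in for a real perceptual hash like Apple's NeuralHash. The file name and the target hash below are made up, and real attacks on learned hashes use gradient descent rather than this naive random search:

    # Toy stand-in for a perceptual hash: an 8x8 average hash (64 bits).
    from PIL import Image
    import numpy as np

    SIZE = 8

    def average_hash(gray):
        """Downscale to SIZE x SIZE, threshold each cell on the mean."""
        img = Image.fromarray(gray.astype(np.uint8))
        small = np.asarray(img.resize((SIZE, SIZE)), dtype=np.float64)
        return (small > small.mean()).flatten()

    def nudge_toward(gray, target, eps=3, iters=20000, seed=0):
        """Greedy random search: keep tiny per-pixel nudges that move
        the hash closer to the target; revert the ones that don't."""
        rng = np.random.default_rng(seed)
        work = gray.astype(np.int16).copy()
        dist = int((average_hash(work) != target).sum())
        h, w = work.shape
        for _ in range(iters):
            if dist == 0:
                break
            y, x = rng.integers(0, h), rng.integers(0, w)
            old = work[y, x]
            work[y, x] = np.clip(old + rng.choice((-eps, eps)), 0, 255)
            new = int((average_hash(work) != target).sum())
            if new <= dist:
                dist = new        # imperceptible change, better hash
            else:
                work[y, x] = old  # revert
        return work.astype(np.uint8), dist

    gray = np.asarray(Image.open("meme.png").convert("L"))  # any image
    target = average_hash(gray).copy()
    target[:8] = ~target[:8]  # pretend this is the hash to collide with
    adv, left = nudge_toward(gray, target)
    print("hash bits still differing:", left)

The point isn't that this exact loop beats NeuralHash; it's that perceptual hashes are built so near-duplicate images collide, and that same tolerance is what lets an attacker steer an innocuous image toward a targeted hash with changes too small to notice.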


If you need a judicial review to confirm that a slightly altered Bernie in Coat and Gloves meme is not the same image as the picture of a child being raped that they have on file, then we have way bigger problems.


Here's the thing with CSAM: it's illegal to view and transmit. So until the police have confiscated your devices, nobody will actually be able to verify that it is a "child being raped."

They'll view visual hashes, look at descriptions, and so forth, but nobody from Apple will actually be looking at the images themselves, because then they would be guilty of viewing and transmitting CSAM.

As I noted in another comment, even the prosecutors and defense lawyers in a case typically only get a description of the content; they don't see it themselves.


This is just not true. Human review is conducted. Apple will conduct human review, Facebook will conduct human review, NCMEC will conduct human review, law enforcement will conduct human review, and lawyers and judges will conduct human review.

Over the years there have been countless articles about how fucked up it is to be a reviewer of flagged content at the big tech companies: https://www.vice.com/en/article/a35xk5/facebook-moderators-a...


Where did you get this idea, Scooby Doo?

It is not illegal to be an unwilling recipient of illegal material. If a package shows up at your door with a bomb, you're not gonna be thrown in jail for having a bomb.


In theory, sure.

At the very least, you'd be one of the primary suspects, and if you somehow got a bad lawyer, all bets are off.

https://hongkongfp.com/2021/08/12/judges-criticise-hong-kong...


Okay, and when a cursory look at the bomb actually reveals it to be a Fisher-Price toy, what then?

What is the scenario where a grey blob gets on your phone that sets off CSAM alerts, an investigator looks at it and sees only a grey blob, and then still decides to alert the authorities even though it's just a grey blob, and the authorities still decide to arrest you even though it's just a grey blob, and the DA still decides to prosecute you even though it's just a grey blob, and a jury still decides to convict you, even though it's still just a grey blob?

You're the one who's off in theory-land imagining that every person in the entire justice system is just as stupid as this algorithm is.


Possession of CSAM is a strict liability crime in most jurisdictions.


That is simply not true. There is no American jurisdiction where child pornography is a strict liability crime.

On this topic, the Supreme Court has ruled in Dickerson v US that, to avoid First Amendment conflicts, child pornography statutes must always be interpreted with at least a "reckless disregard" standard.

Here is a typical criminal definition, from Minnesota, where a defendant recently argued that the statute was strict liability and therefore unconstitutional; the courts rejected that argument because the statute is clearly written to require knowledge and intent:

> Subd. 4. Possession prohibited. (a) A person who possesses a pornographic work or a computer disk or computer or other electronic, magnetic, or optical storage system … containing a pornographic work, knowing or with reason to know its content and character, is guilty of a felony …


I tried to look up [Antonio] Dickerson v. US, but I don't see any SCOTUS decision on it, only a certiorari petition. Do you have a reference for the decision?


Yep, sorry. Dickerson was an appellant who cited the relevant case law, which is New York v Ferber.

Now, Dickerson rightfully lost, and it's appropriate that SCOTUS rejected his case, because he was involved in child porn production, not possession, so he can't rely on the Ferber precedent. He had the opportunity to ask the underage person in question their age and chose not to, which would meet the reckless disregard standard anyway.


I don't see any mention of the reckless disregard standard in the New York v. Ferber decision, either? So far as I can see, it just says that CSAM is outside of the scope of 1A, so long as it's "adequately defined by the applicable state law". Am I missing something in the opinion?



