
Assume, just for the sake of discussion, that it may be possible to determine, without violating anyone's privacy (and actually without doing a damn thing), that someone might be a child.

Will this help?

(There’s an elephant in the room. All of the internet wants its straw elephant out, I think. The lawyers just got confused.)



You will still have errors. Either you will have false negatives, where children slip through (which the FTC is not a fan of), or you will have false positives, where adults are unfairly restricted (which everyone else should have a problem with).
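
To make that tradeoff concrete, here's a minimal Python sketch. Everything in it is hypothetical (the detector, scores, labels, and thresholds are all made up), but it illustrates why any detector built on a score plus a cutoff can only trade one error type for the other, never eliminate both:

    # Hypothetical age detector: a "looks like a child" score in [0, 1]
    # plus a cutoff threshold. Moving the threshold trades false
    # negatives for false positives.
    def count_errors(scores, labels, threshold):
        # labels[i] is True if user i really is a child
        fn = sum(1 for s, is_child in zip(scores, labels)
                 if is_child and s < threshold)       # children who slip through
        fp = sum(1 for s, is_child in zip(scores, labels)
                 if not is_child and s >= threshold)  # adults who get blocked
        return fn, fp

    # Toy data: adult and child scores overlap, as any real signal would.
    scores = [0.2, 0.4, 0.55, 0.6, 0.7, 0.9]
    labels = [False, False, True, False, True, True]

    for t in (0.5, 0.65, 0.8):
        fn, fp = count_errors(scores, labels, t)
        print(f"threshold={t}: {fn} children slip through, {fp} adults blocked")

Raising the threshold lets more children slip through; lowering it locks out more adults. As long as the score distributions overlap, no threshold gets both numbers to zero.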

Even in that scenario, content filtering is more feasible and practical than detecting and banning children.

But, pushing that problem aside as well, sure. Assuming that you didn't have false positives, and assuming that you didn't have to violate anyone's privacy, and assuming that the solution wasn't obscenely expensive, that theoretical solution would be very helpful. It would basically solve the entire problem, for pretty much everyone.

But that theoretical solution doesn't exist.

If we're willing to make those assumptions, I would also like a special gun that only shoots bad people and doesn't work in schools or churches. And it would be nice to give law enforcement with warrants access to lawful information without requiring encryption backdoors. Of course, we'll also need to implement the new training regimen that gets rid of any corrupt prosecutors or officers.

I'll happily posit things for the sake of discussion, but there's a point where I'm suspending so much reality that the discussion just breaks down. I can't stop looking at the proposed solution and saying, "yeah but this thing you're talking about doesn't work." I guess I'm just not sure what you're getting at.


“adults will be unfairly restricted” from what?

Asking for a friend.


Let's say Youtube's theoretical solution labels you as a child, even though you're not. Without some kind of way to get around that block, you've now lost access to Youtube.

Note that this scenario isn't entirely theoretical. It's rare, but there are already subsets of Youtube (trailers for horror movies are where I see this most often) that are locked behind a login screen because the uploader has indicated they're intended for mature audiences. You either log into Youtube (sacrificing some privacy) or you just don't get access to those videos.

This isn't a huge problem today, because very little content falls into that category. But imagine a world where you fail Youtube's theoretical privacy-preserving test, and suddenly you can't watch anything that isn't designated as specifically directed at children, regardless of the content.

Of course, on some level that's just Youtube's choice. But now imagine if a government agency like the FTC dictates that every large Youtube-scale platform has to implement this test. Suddenly you find yourself not only unable to watch videos on Youtube, but also unable to make a Facebook/Twitter account. Imagine if you can't search for certain things on Google.

Note that this final scenario also isn't entirely theoretical with the FTC. My reading of the FTC commissioner's comments is that the value of targeting Youtube is specifically that it's a centralized choke-point for this kind of content. I don't see any reason to assume that, if creators migrate off of Youtube, the FTC won't follow them and impose similar rules wherever they end up.


It’s about privacy, not content.

(I have a lot to say about content, including suggesting specific solutions to specific problems, but I’ve decided not to air it in a public forum. Further discussion is paywalled. Sponsor me on github for more!!! :))



