Hacker News

Analogies make shit arguments when we can discuss the matter at hand directly instead. Systems with central access controls are vulnerable to at least two distinct kinds of attack: one procedural and one technical.

On the procedural side, it's trivial to degenerate from

* a subject's privacy is breached when the lawfully constituted authorities present proof to a judge of reasonable suspicion that the subject has engaged in antisocial and immoral acts against society or its members

to

* it would be awfully nice if we had access to everyone's personal data so that we could punish anyone we like for anything we like

In the United States we have already standardized on breaching everyone's privacy to the greatest degree possible, with little or no recourse for the citizenry. Once you have all the information on who speaks or believes in a certain way, acting against them is a relatively small step.

On the technical side, a centralized system is impossible to secure. When, not if, it is compromised, to the victor go the spoils. It is comparatively harder to attack millions of clients. Even if, in theory, a central node could be used to compromise clients, in practice this can trivially be made harder to exploit by not allowing the server to push to clients and not updating constantly. Attacks may need to remain undetected for a substantial length of time before a substantial portion of the user base is poisoned, and the chance of detection climbs toward 100% the more parties you attack.
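That last claim can be sketched numerically. Assuming (my numbers, not the comment's) each attacked client independently has some small chance of noticing the compromise, the probability that at least one target detects the attack approaches 100% as the number of targets grows:

```python
def detection_probability(p_single: float, n_targets: int) -> float:
    """P(at least one of n independently attacked targets detects the
    compromise) = 1 - (1 - p)^n, where p is the per-target detection odds."""
    return 1.0 - (1.0 - p_single) ** n_targets

if __name__ == "__main__":
    # Hypothetical 0.1% per-target detection chance.
    for n in (1, 1_000, 100_000):
        print(n, detection_probability(0.001, n))
```

Even with a tiny per-target detection rate, attacking a hundred thousand clients makes discovery a near certainty, which is the asymmetry the paragraph above relies on.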

Furthermore, if you become 100% effective at detecting illegal porn shared via WhatsApp, doesn't it follow that users will switch to decentralized means of transmitting illegal information? We would be substantially disadvantaging the population as a whole for little gain.



“Illegal porn” isn’t the preferred phrase in this case. It’s “child sexual abuse images.” We’re not talking about pornography here.


We ARE talking about porn; I'm not sure how you can be confused on that point. Nobody is defending the act of abusing kids or the scumbags sharing media of same. Both acts are evil.

The bigger point is that backdooring everyone's communication will do exactly jack to prevent child abuse. It won't even make it much harder for scumbags to share it; more of them will just use privacy-preserving p2p applications. In fact, if you fuck with everyone's sense of privacy, you will be liable to push a much, much larger portion of the population to use such tech.

The more normal people who aren't drug dealers, crypto nerds, or pedos use tools like TOR, the harder it will be to pick out the weirdos. Meanwhile, most of the harm to children won't be captured on digital media, because it will keep happening when trusted adults abuse minors they ought to be protecting instead of harming.

Attacking the sexual abuse problem in America by backdooring communication platforms is about as effective as reducing deaths due to traffic accidents by hanging out in parking lots.


> Meanwhile most of the harm to children wont be captured on digital media because it will keep happening when trusted adults abuse minors they ought to be protecting instead of harming.

Can you explain what you’re getting at here? I don’t get how this fits into your argument.


The problem they are highlighting is child abuse. Child abuse is indeed a horrible thing. It's also a very hard thing to combat, because it largely happens when a trusted adult abuses a child and mostly doesn't document it for posterity or share it on the internet. The victims face the double whammy of shame and social pressure not to accuse the victimizer.

Attacking the perverts sharing pictures is worthy but does almost nothing to prevent child abuse. The best case scenario is you expose a small portion of perverts and thus prevent idiots from trusting their kids with them.

Unfortunately, attacking messaging services will do little to combat perverts sharing pictures, as they can trivially switch to slightly better technology.

The effect is multiplicative. Imagine you start with 1% of victimizers sharing media online. Imagine that you catch 1/10 of 1% of them before the rest figure out that centralized sharing is not the best idea. You are now stuck with permanent downsides and will catch few additional people.

0.01 * 0.001 = 0.00001
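The arithmetic above, spelled out (the rates are the comment's hypotheticals, not real data):

```python
share_centrally = 0.01        # 1% of victimizers share media online at all
caught_before_exodus = 0.001  # 1/10 of 1% of those are caught before the rest
                              # migrate to safer, decentralized channels

# Fraction of all victimizers the scheme ever catches: the two rates multiply.
fraction_caught = share_centrally * caught_before_exodus
print(fraction_caught)  # on the order of 1 in 100,000
```

Because the rates multiply, shrinking either one (fewer sharers, or faster migration off centralized platforms) collapses the total yield of the surveillance regime toward zero.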

The logical conclusion is that this isn't a very effective way to combat child abuse.


Great! But you seem to be the one who’s confused. I was commenting on semantics. The “illegal porn” being referenced above is not porn. It is abuse.

Carry on with your discussion of why backdooring encryption is bad. I wasn't commenting on that matter. This entire thread has no real discussion of how to solve the "encryption-while-scanning problem." With the massive growth in CSAM sharing in the last few years, enabled by easy-to-use services like facebook and dropbox, it's clear that solving this dilemma is just as important as protecting some normal person's ability to use TOR. I'm extremely privacy focused, but I'm curious to learn if crippling fb/dropbox/etc is worth it for the sake of solving this issue.

Also, “traffic accidents” isn’t preferred, either! Do you hate me now? Language is important. I’m a transportation planner who focuses on safety, so I can talk about this all day.


You CAN'T solve the encryption-while-scanning problem. AI is capable of misidentifying a naked baby picture as porn, and nobody will want to run a piece of software that might accidentally report them to the FBI and ruin their life. So long as users have the ability to write and distribute software from regions outside your jurisdiction, and the ability to install software of their choosing, you can't stop people from communicating information you don't approve of. This is equally true of political discourse you and I both agree ought to be freely disseminated as it is of evil material you and I would both desire to ban.
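The misidentification worry is a base-rate problem, which a quick Bayes calculation makes concrete. All the numbers below are made up for illustration: even a scanner with a seemingly tiny false-positive rate, run against photos that are overwhelmingly innocent, produces flags that are almost all false.

```python
fp_rate = 0.001     # hypothetical: innocent photo wrongly flagged (0.1%)
tp_rate = 0.99      # hypothetical: actual abuse material correctly flagged
prevalence = 1e-6   # hypothetical: fraction of scanned photos that are abusive

# Bayes' rule: P(actually abusive | flagged)
p_flagged = tp_rate * prevalence + fp_rate * (1 - prevalence)
p_guilty_given_flag = tp_rate * prevalence / p_flagged
print(p_guilty_given_flag)  # well under 1% of flags point at real material
```

Under these assumptions, roughly 999 of every 1,000 flags land on an innocent person's naked-baby photo, which is exactly the "accidentally reported to the FBI" scenario above.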

Maximally crazy is implementing 1984 in order to pretend to stop child abuse while it continues to go on all around us.



