Doc O'Leary <drol...@2017usenet1.subsume.com> writes:
> Russ Allbery <ea...@eyrie.org> wrote:
>> There are indeed organized groups of people who are trying to trade
>> CSAM and will use your service to do it if you let them, and there's
>> real abuse of real people underneath it.
> So you put in the effort to *solve* that problem, not sweep it under the
> rug, which is all you accomplish when you start deleting data.
The specific thing you are required to do by US law (in my personal
understanding, I am not a lawyer, this is not legal advice for your
specific situation, read it for yourself at [1]) is report the data to
NCMEC and then make it inaccessible, not delete it (which would be
destroying evidence), so in that narrow sense I do actually agree with
you. I suspect this is similar in most other jurisdictions, although
obviously NCMEC is a US thing so the details will vary.
[1] https://www.law.cornell.edu/uscode/text/18/2258A
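To make the "inaccessible but not deleted" distinction concrete, here is a
rough sketch in Python of what quarantining an article might look like on a
spool-based server. Every path and name in it is made up for illustration;
it is not a recipe and it is definitely not legal advice.

    import os
    import shutil
    import time

    # Hypothetical paths; the point is only that the data is preserved
    # (as evidence) while becoming unreachable by readers.
    SPOOL_DIR = "/var/spool/news/articles"
    QUARANTINE_DIR = "/var/spool/news/quarantine"

    def quarantine_article(token: str) -> str:
        """Move an article out of the public spool into a restricted area.

        Returns the quarantine path so it can be recorded alongside the
        report filed with NCMEC.  Nothing is deleted.
        """
        src = os.path.join(SPOOL_DIR, token)
        os.makedirs(QUARANTINE_DIR, mode=0o700, exist_ok=True)
        dst = os.path.join(QUARANTINE_DIR, f"{token}.{int(time.time())}")
        shutil.move(src, dst)   # inaccessible to readers, not destroyed
        os.chmod(dst, 0o600)    # readable only by the admin account
        return dst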
However, I personally would rather juggle raw plutonium than spend any
time handling that kind of legal evidence and therefore opt out of the
entire problem by not carrying binaries, since otherwise I am legally
obligated to spend whatever time it takes me to handle that data properly
should any problem arise. Since personally I don't care about any of the
binaries anyway (there are numerous better sources for any non-textual
information I want than Usenet), this seems like it would be a very bad
use of my time, even apart from the fact that if I ever got a report I
would have to go look at the reported data to see whether the report was
correct, and that does not sound fun. Likewise for DMCA, which is less
legally fraught but still highly annoying.
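For what it's worth, the mechanics of not carrying binaries are simple. As
a purely illustrative sketch, not the filtering hook of any particular
server and with deliberately crude heuristics, a filter might reject
articles whose bodies look like yEnc or long runs of base64:

    import re

    # Crude heuristics for illustration only; real filters (cleanfeed-style
    # setups, for example) are considerably more careful.
    YENC_RE = re.compile(r"^=ybegin ", re.MULTILINE)
    B64_LINE_RE = re.compile(r"^[A-Za-z0-9+/=]{60,}$")

    def looks_like_binary(body: str, threshold: int = 50) -> bool:
        """Guess whether an article body carries an encoded binary."""
        if YENC_RE.search(body):
            return True
        b64_lines = sum(1 for line in body.splitlines()
                        if B64_LINE_RE.match(line))
        return b64_lines >= threshold

    def filter_article(body: str) -> str:
        """Return "" to accept, or a rejection reason (hypothetical)."""
        if looks_like_binary(body):
            return "Binary posts are not carried on this server"
        return ""

The hard part is the policy decision, not the code.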
Other people's mileage may vary, and that's fine! Just know what you're
getting into and realize that no one involved in enforcing laws cares in
the slightest what your opinion is of those laws, so it's worth carefully
picking what types of civil disobedience you want to engage in.
Personally, I vote against copyright bullshit but don't engage in civil
disobedience around it. That's where my personal risk tradeoff is.
> My point is always going to come back to the fact that few people are
> actually interested in solving abuse on *any* platform.
Well, I have spent a bunch of time working professionally with people who
are trying to reduce platform abuse and I have a huge amount of respect
for the work that they do. I don't think anyone who works professionally
in the field thinks this problem is *solvable*. It's a classic
adversarial security problem, and those are almost never solvable. You
would have to make all of your opponents permanently disappear, and, well,
good luck with that. Governments have been trying for millennia. The
best you can manage is harm reduction.
You can do various things to make it easier and various things to make it
harder. One of the most effective things you can do to make dealing with
abuse easier is to ban all non-textual media, because that takes a lot of
the most annoying, dangerous, or horrific types of abuse off the table.
Obviously, that's a tradeoff, and if everyone made that tradeoff that
would be sad for society since there are a lot of non-textual things that
are worth sharing. But I leave hosting the non-textual stuff to people
with platform abuse teams and lawyers on retainer. Again, that's where my
risk tradeoff is.
I think people who think fully decentralized anonymous (or even mostly
anonymous) non-textual file sharing is something they want to get involved
in are completely out of their gourd because you're just hanging up a
giant sign saying "trade all of your illegal shit here, viruses welcome,"
but I'm not the government and I'm not going to stop you. My only goal
here is to make sure people at least go into it with their eyes open.
> But it’s worse than that! It’s gone the other way: cloud platforms have
> business models where the profit motive is in *supporting* that evil
> shit.
I wanted to say that this is definitely not true, but I can see how it
might look that way from a particular angle. It is true
that in pursuit of profits, a bunch of companies have built network
platforms that make abuse much easier, and are now desperately trying to
play catch-up to filter out the shit that they don't want to carry. There
is an interesting argument to be made that social media itself is the
problem and we should not have any social media at all because it enables
abuse. I don't think I *agree* with that argument, but one can certainly
make that argument coherently, and it's gaining some social popularity.
(This is what repealing section 230 in the US would mean: make social
media effectively illegal by making every service provider liable for
everything they carry. And there are a fair number of advocates for that,
although not all of them understand what they're advocating.)
But if you mean the cloud providers are happy about or actively encourage
people doing evil shit like CSAM on their platforms, this is absolutely
100% not true and I know it's not true from direct personal experience.
Cloud platforms spend large quantities of money, hire whole teams of very
expensive people, and write whole new algorithms and scanning methods to
try to get rid of shit like CSAM. It's a significant expense; it is
absolutely not a profit center. They do that in part because people who
work for cloud platforms are human and have normal human feelings about
CSAM, in part because it's a public relations nightmare, and in part
because not doing some parts of that work is illegal.
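To give a sense of what "scanning methods" means at the most basic level:
the simplest building block is matching uploaded content against hashes of
known-bad material. Real systems use perceptual hashes (PhotoDNA and the
like) so that re-encoded or slightly altered copies still match, and the
hash lists come from organizations like NCMEC rather than from anything you
build yourself; this toy sketch shows only the exact-match version of the
idea.

    import hashlib

    # Toy illustration only.  An exact digest like SHA-256 misses any copy
    # that has been re-encoded or altered by even one byte, which is why
    # production systems use perceptual hashing instead.
    KNOWN_BAD_SHA256 = set()   # in reality, sourced from a vetted hash list

    def should_block(data: bytes) -> bool:
        """Return True if the uploaded bytes exactly match a known-bad hash."""
        digest = hashlib.sha256(data).hexdigest()
        return digest in KNOWN_BAD_SHA256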