
ignorant fucks in congress

Unknown
Feb 2, 2022, 1:21:01 AM
You may recall the terrible and dangerous EARN IT Act from two years ago,
which was a push by Senators Richard Blumenthal and Lindsey Graham to chip
away more at Section 230 and to blame tech companies for child sexual abuse
material (CSAM). When it was initially introduced, many people noticed that
it would undermine both encryption and Section 230 in a single bill. While
the supporters of the bill insisted that it wouldn't undermine encryption,
the nature of the bill clearly set things up so that you either had to
abandon encryption or spy on everything. Eventually, the Senators were
persuaded to adopt an amendment from Senator Patrick Leahy to more
explicitly attempt to exempt encryption from the bill, but it was done in a
pretty weak manner. That said, the bill still died.
But, as with 2020, 2022 is an election year, and in an election year some
politicians just really want to get their name in headlines about how
they're "protecting the children," and Senator Richard Blumenthal loves the
fake "protecting the children" limelight more than most other Senators. And
thus he has reintroduced the EARN IT Act, claiming (falsely) that it will
somehow "hold tech companies responsible for their complicity in sexual
abuse and exploitation of children." This is false. It will actually make
it more difficult to stop child sexual abuse, but we'll get there. You can
read the bill text here, and note that it is nearly identical to the
version that came out of the 2020 markup process with the Leahy Amendment,
with a few very minor tweaks. The bill has a lot of big name Senators as
co-sponsors, and that's from both parties, suggesting that this bill has a
very real chance of becoming law. And that would be dangerous.
If you want to know just how bad the bill is, I found out about the
re-introduction of the bill -- before it was announced anywhere else -- via
a press release sent to me by NCOSE, formerly "morality in media," the
busybody organization of prudes who believe that all pornography should be
banned. NCOSE was also a driving force behind FOSTA -- the dangerous law
with many similarities to EARN IT that (as we predicted) did nothing to
stop sex trafficking, and plenty of things to increase the problem of sex
trafficking, while putting women in danger and making it more difficult for
the police to actually stop trafficking.
Amusingly (?!?) NCOSE's press release tells me both that without EARN IT
tech platforms "have no incentive to prevent" CSAM, and that in 2019 tech
platforms reported 70 million CSAM images to NCMEC. They use the former to
insist that the law is needed, and the latter to suggest that the problem
is obviously out of control -- apparently missing the fact that the latter
actually shows how the platforms are doing everything they can to stop CSAM
on their platforms (and others!) by following existing laws and reporting
it to NCMEC where it can be put into a hash database and shared and blocked
elsewhere.
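As a simplified, hypothetical sketch of how that hash-database pipeline works: platforms hash known illegal images, share the hashes with a clearinghouse like NCMEC, and then check uploads against the shared set. Note that real systems (e.g. Microsoft's PhotoDNA) use perceptual hashes that survive re-encoding and resizing; the plain SHA-256 used here only catches byte-identical copies, and all names and values below are illustrative, not any real API.

```python
import hashlib

# Hypothetical shared hash set, as a clearinghouse might distribute.
# Built here from stand-in bytes purely for illustration.
known_hashes = {
    hashlib.sha256(b"example-known-bytes").hexdigest(),
}

def file_digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def should_block(data: bytes) -> bool:
    """True if the upload matches a known hash (block and report it)."""
    return file_digest(data) in known_hashes

print(should_block(b"example-known-bytes"))  # True: digest is in the set
print(should_block(b"some fresh upload"))    # False: no match
```

The point of the shared database is that a match found by one platform can be blocked everywhere else without re-transmitting the image itself, only its hash.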
But facts are not what's important here. Emotions, headlines, and votes in
November are.
Speaking of that lack of facts: alongside the bill, they also released a
"myth v. fact" sheet which is just chock full of misleading and simply
incorrect nonsense. I'll break that down in a separate post, but just as
one key example, the document really leans heavily on the fact that Amazon
sends a lot fewer reports of CSAM to NCMEC than Facebook does. But, if you
think for more than 3 seconds about it (and aren't just grandstanding for
headlines) you might notice that Facebook is a social media site and Amazon
is not. It's comparing two totally different types of services.
However, for this post I want to focus on the key problems of EARN IT. In
the very original version of EARN IT, the bill created a committee to study
if exempting CSAM from Section 230 would help stop CSAM. Then it shifted to
the same form it's in now: the committee still exists, but the bill skips
the step of having the committee determine whether chipping away at 230
would help, and simply includes that as a key part of the bill. The 230
part mimics
FOSTA (which, again, has completely failed to do what it claimed and has
made the actual problems worse), in that it adds a new carve-out to
Section 230 that strips CSAM-related claims of its protections.
EARN IT will make the CSAM problem much, much worse.
At least in the FOSTA case, supporters could (incorrectly and misleadingly,
as it turned out) point to Backpage as an example of a site that had been
sued for trafficking and used Section 230 to block the lawsuit. But here...
there's nothing. There really aren't examples of websites using Section 230
to try to block claims of child sexual abuse material. So it's not even
clear what problem these Senators think they're solving (unless the problem
is "not enough headlines during an election year about how I'm protecting
the children.")
The best they can say is that companies need the threat of law to report
and take down CSAM. Except, again, pretty much every major website that
hosts user content already does this. This is why groups like NCOSE can
trumpet "70 million CSAM images" being reported to NCMEC. Because all of
the major internet companies actually do what they're supposed to do.
And here's where we get into one of the many reasons this bill is so
dangerous. It totally misunderstands how Section 230 works, and in doing so
(as with FOSTA) it is likely to make the very real problem of CSAM worse,
not better. Section 230 gives companies the flexibility to try different
approaches to dealing with various content moderation challenges. It allows
for greater and greater experimentation and adjustments as they learn what
works -- without fear of liability for any "failure." Removing Section 230
protections does the opposite. It says if you do anything, you may face
crippling legal liability. This actually makes companies less willing to do
anything that involves trying to seek out, take down, and report CSAM
because of the greatly increased liability that comes with admitting that
there is CSAM on your platform to search for and deal with.
EARN IT gets the problem exactly backwards. It disincentivizes action by
companies, because the vast majority of actions will actually increase
rather than decrease liability. As Eric Goldman wrote two years ago, this
version of EARN IT doesn't penalize companies for CSAM, it penalizes them
for (1) not magically making all CSAM disappear, (2) knowing too much
about CSAM (i.e., telling them to stop looking for it and taking it down),
or (3) not exiting the industry altogether (as we saw a bunch of dating
sites do post FOSTA).
EARN IT is based on the extremely faulty assumption that internet companies
don't care about CSAM and need more incentive to care, rather than the
real problem, which is that CSAM has always been a huge problem and
stopping it requires actual law enforcement work focused on the producers
of that content. But by threatening internet websites with massive
liability if they make a mistake, it actually makes law enforcement's job
harder, because those websites will be less able to work with law
enforcement. This is not theoretical. We already saw exactly this problem
with FOSTA, in which multiple law enforcement agencies have said that FOSTA
made their job harder because they can no longer find the information they
need to stop sex traffickers. EARN IT creates the exact same problem for
CSAM.
So the end result is that by misunderstanding Section 230, by
misunderstanding internet companies' existing willingness to fight CSAM,
EARN IT will undoubtedly make the CSAM problem worse by making it more
difficult for companies to track CSAM down and report it, and more
difficult for law enforcement to track down and arrest those actually
responsible for it. It's a very, very bad and dangerous bill -- and that's
before we even get to the issue of encryption!
EARN IT is still very dangerous for encryption
EARN IT supporters claim they "fixed" the threat to encryption in the
original bill by using text similar to Senator Leahy's amendment to say
that using encryption cannot "serve as an independent basis for liability."
But, the language still puts encryption very much at risk. As we've seen,
the law enforcement/political class is very quick to want to (falsely)
blame encryption for CSAM. And by saying that encryption cannot serve as
"an independent basis" for liability, that still leaves open the door to
using it as one piece of evidence in a case under EARN IT.
Indeed, one of the changes to the bill from the one in 2020 is that
immediately after saying encryption can't be an independent basis for
liability it adds a new section that wasn't there before that effectively
walks back the encryption-protecting stuff. The new section says: "Nothing
in [the part that says encryption isn't a basis for liability] shall be
construed to prohibit a court from considering evidence of actions or
circumstances described in that subparagraph if the evidence is otherwise
admissible." In other words, as long as anyone bringing a case under EARN
IT can point to something that is not related to encryption, they can point
to the use of encryption as additional evidence of liability for CSAM on
the platform.
Again, the end result is drastically increasing liability for the use of
encryption. While no one will be able to use the encryption alone as
evidence, as long as they point to one other thing -- such as a failure to
find a single piece of CSAM -- then they can bring the encryption evidence
back in and suggest (incorrectly) some sort of pattern or willful
blindness.
And this doesn't even touch on what will come out of the
"committee" and its best practices recommendations, which very well might
include an attack on end-to-end encryption.
The end result is that (1) EARN IT is attacking a problem that doesn't
exist (the use of Section 230 to avoid responsibility for CSAM), (2) EARN
IT will make the actual problem of CSAM worse by making it much riskier
for internet companies to fight CSAM, and (3) EARN IT puts encryption at
risk by potentially increasing the liability of any company that offers
encryption.
It's a bad and dangerous bill and the many, many Senators supporting it for
kicks and headlines should be ashamed of themselves.

dtsa...@umassd.edu <zim...@daktel.com.someone@apple.com> wrote:
>
> Sen. Chuck Grassley (R-Iowa) joined Sens. Lindsey Graham (R-S.C.) and
> Richard Blumenthal (D-Conn.) to reintroduce bipartisan legislation
> encouraging the tech industry to take online child sexual exploitation
> seriously. The Eliminating Abusive and Rampant Neglect of Interactive
> Technologies (EARN IT Act) removes blanket immunity for violations of laws
> related to online child sexual abuse material (CSAM).
>
> “Online service providers have long acknowledged their responsibility to
> enforce user policies and moderate certain content, but many have not done
> enough to combat child predators who use these platforms to exploit and
> victimize children. This bill would ensure that online service providers
> that fail to crack down on such content are not able to escape consequence
> by hiding behind Section 230 immunity. This commonsense bill received
> unanimous bipartisan support in the Judiciary Committee last Congress, and
> it’s time we get it on the books to help prevent future child exploitation
> online,” Grassley said.
>
> Highlights of the EARN IT Act:
>
> Creates a strong incentive for the tech industry to take online child
> sexual exploitation seriously. The bill amends Section 230 of the
> Communications Decency Act to remove blanket immunity from Federal civil,
> State criminal and State civil child sexual abuse material laws entirely.
> Service providers will now be treated like everyone else when it comes to
> combating child sexual exploitation and eradicating CSAM, creating
> accountability.
> Establishes a National Commission on Online Child Sexual Exploitation
> Prevention that will be responsible for developing voluntary best
> practices. The Commission consists of the heads of DOJ, DHS and FTC, along
> with 16 other members appointed equally by Congressional leadership,
> including representatives from: law enforcement, survivors and victims’
> services organizations, constitutional law experts, technical experts and
> industry.
> Recourse for survivors and tools for enforcement. The bill bolsters
> enforcement of child sexual abuse material statutes and allows survivors
> civil recourse.
> In July 2020, the EARN IT Act (S. 3398) passed the Senate Judiciary
> Committee unanimously. Section 230 of the Communications Decency Act gives
> “interactive computer services” significant immunity from civil liability,
> as well as state criminal liability for third party content on their
> platforms. Given this limited liability, many companies do not aggressively
> go after online child sexual exploitation.
>
> The EARN IT Act is supported by more than 240 groups, survivors and
> stakeholders, including the National Center for Missing & Exploited
> Children (NCMEC), Rights4Girls, the National Center on Sexual Exploitation,
> National District Attorneys Association, National Association of Police
> Organizations, Rape, Abuse & Incest National Network, International Justice
> Mission and Major Cities Chiefs Association.
>
> Along with Graham, Blumenthal and Grassley, the legislation is cosponsored
> by Sens. Dick Durbin (D-Ill.), Josh Hawley (R-Mo.), Dianne Feinstein
> (D-Calif.), Joni Ernst (R-Iowa), Bob Casey (D-Penn.), Sheldon Whitehouse
> (D-R.I.), John Kennedy (R-La.), Mazie Hirono (D-Hawaii), Rob Portman
> (R-Ohio), Lisa Murkowski (R-Alaska), John Cornyn (R-Texas), Catherine
> Cortez Masto (D-Nev.), Marsha Blackburn (R-Tenn.), Susan Collins (R-Maine),
> Maggie Hassan (D-N.H.), Cindy Hyde-Smith (R-Miss.) and Mark Warner (D-Va.).
> Representatives Ann Wagner (R-Mo.) and Sylvia Garcia (D-Texas) introduced
> companion legislation in the House of Representatives.
>



--
donald trump sucks