
  The story so far:
In the 90s the Internet was created.
This has made a lot of people very angry and been widely regarded as a bad move.

(with apologies to Douglas Adams)[1]
There's a new bill afoot in Congress called the EARN IT Act. A discussion draft released by Bloomberg is available as a PDF here. This bill is trying to convert your anger at Big Tech into law enforcement's long-desired dream of banning strong encryption. It is a bait-and-switch. Don't fall for it. I'm going to explain where it came from, why it's happening now, why it's such an underhanded trick, and why it will not work for its stated purpose. And I'm only going to barely scratch the surface of the many, many problems with the bill.
The 1990s: Congress Passes Section 230 and CALEA
In the 1990s, Congress passed several pieces of legislation that helped shape the Internet as we currently know it. Today I want to talk about two of them. One was Section 230 of the Communications Decency Act of 1996 (CDA). Section 230 says, in essence, that online platforms (providers of "interactive computer services") mostly can't be held liable for the things their users say and do on the platform.[2] For example: If you defame me on Twitter, I can sue you for defamation, but I can't sue Twitter. Without the immunity provided by Section 230, there might very well be no Twitter, or Facebook, or dating apps, or basically any website with a comments section. They would all have been sued out of existence, or never started up at all, in light of the potentially crushing legal liability to which they'd be exposed without Section 230.
The other significant law from the 1990s that I want to talk about today is the Communications Assistance for Law Enforcement Act of 1994, or CALEA for short. CALEA requires "telecommunications carriers" (e.g., phone companies) to make their networks wiretappable for law enforcement. However, that mandate does not cover "information services": websites, email, social media, chat apps, cloud storage, and so on. Put another way, the providers of information services are not required to design their services to be surveillance-friendly. Let's call that the information services carve-out in CALEA. Plus, even covered entities are free to encrypt communications and throw away the keys to decrypt them. Let's call that the encryption carve-out. As my colleague, veteran telecom lawyer Al Gidari, explained in a 2016 blog post, those carve-outs represent a compromise among competing interests, such as law enforcement, network security, privacy, civil liberties, and technological innovation. In the quarter-century since it was passed, CALEA has never been amended.
In passing these two laws, Congress made a wise policy choice not to strangle the young Internet economic sector in its cradle. Exposing online services to crippling legal liability would (among other things) inhibit the free exchange of information and ideas; mandating that information services be surveillance-friendly to the U.S. government would (among other things) hurt their commercial viability in foreign markets. Congress chose instead to encourage innovation in the Internet and other new digital technologies. And the Internet bloomed. (In the 90s the Internet was created.) 
Here in 2020, the Internet sector (and tech more broadly) is a huge economic driver in the U.S. Thanks in part to the efforts of U.S.-based companies, software has eaten the world. Billions of humans can connect with each other. And yet nobody really seems to enjoy being online very much anymore, because it turns out that humans are terrible. (This has made a lot of people very angry and been widely regarded as a bad move.)
2020: People Are Mad About Section 230
Years of imbibing a concentrated font of human venality every time we open our phones, coupled with the metastatic growth of surveillance capitalism, have birthed the current, bipartisan techlash. The techlash is taking several forms, among them the growing zeal for amending or outright repealing Section 230. The idea is that Section 230 is no longer needed; it's served its original purpose, if anything it was too successful, and now U.S. tech companies have outgrown it — and grown too big for their britches, period. The harm that Section 230 allows, the thinking goes, now outweighs the good, and so the time has come to hold providers accountable for that harm.
Of course, human terribleness wasn't invented in the 21st century and it is not the Internet's fault. If men were angels, no CALEA or Section 230 would have been necessary.[3] As Balk's First Law holds, "Everything you hate about The Internet is actually everything you hate about people." Section 230 has kept the providers of the former largely immune from being held accountable for the online abuse of the latter. But in the age of the techlash, that immunity has been slowly eroding.
Unlike CALEA, Section 230 has been amended since it was passed: SESTA/FOSTA, enacted in 2018, pierces providers' immunity from civil and state-law claims about sex trafficking. Just as pretty much everybody predicted, SESTA/FOSTA has turned out to endanger sex workers instead of protecting them, and is currently being challenged as unconstitutional in federal court (including by my colleague Daphne Keller).
Now, riding on the success of SESTA/FOSTA, Senators Lindsey Graham (R-SC) and Richard Blumenthal (D-CT) (who were among SESTA/FOSTA's early cosponsors in the Senate) are reportedly about to introduce another bill that would take another bite out of Section 230 immunity, according to The Information and Bloomberg. This time, the bill's target is child sex abuse material (CSAM) online. The idea of the bill is to create a federal commission that will develop best practices for combatting CSAM online, which online service providers will have to follow or else risk losing Section 230 immunity as to CSAM claims.
This proposal does not arise in a regulatory vacuum. There's already an existing federal statutory scheme criminalizing CSAM and imposing duties on providers. And it already allows providers to be held accountable for CSAM on their services, without any need to amend Section 230.
Federal CSAM Law
Federal law, specifically Chapter 110 of Title 18 of the U.S. Code (18 U.S.C. §§ 2251-2260A), already makes everything about CSAM a crime: producing, receiving, accessing, viewing, possessing, distributing, selling, importing, etc. If you do any of these things, the Department of Justice (DOJ) will prosecute you, and you may go to prison for many years. In addition to criminal penalties, Section 2255 of the law also authorizes civil lawsuits by victims of CSAM, so you could be sued by your victims in addition to going to prison. 
Section 2258A of the law imposes duties on online service providers, such as Facebook and Tumblr and Dropbox and Gmail. The law mandates that providers must report CSAM when they discover it on their services, and then preserve what they've reported (because it's evidence of a crime). Providers who fail to comply with this obligation face substantial (and apparently criminal) penalties payable to the federal government. U.S. v. Ackerman, 831 F.3d 1292, 1296-97 (10th Cir. 2016).[4] The statute puts the Attorney General in charge of enforcing the reporting requirements for providers. Section 2258A was recently updated in late 2018 to expand providers' reporting duties. Importantly, those duties, even after the expansion, do not include any duty to proactively monitor and filter content on the service to look for CSAM. Section 2258A only requires providers to report CSAM they "obtain[] actual knowledge of."
If providers report and preserve CSAM in accordance with the law, then they are protected from legal liability (both civil and criminal, in both federal and state court) for those actions.[5] This protection, found in Section 2258B, "insulates [providers] only when they … pass evidence along to law enforcement and comply with its preservation instructions." Ackerman, 831 F.3d at 1297. The safe harbor is not absolute: Section 2258B(b) disqualifies providers from protection under certain circumstances, such as if the provider engages in intentional misconduct.
The Section 2258B safe harbor is not the same thing as Section 230 immunity. Section 230, as noted, has always had an exception for federal criminal law. In fact, the statutory text for that exception, Section 230(e)(1), expressly says, "Nothing in this section shall be construed to impair the enforcement of [chapter] 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute." This exception is limited to criminal cases; civil claims against providers by CSAM victims under Section 2255 are still barred.[6]
Put simply, Section 230 does not keep federal prosecutors from holding providers accountable for CSAM on their services. As Techdirt's Mike Masnick put it, "not a single thing in CDA 230 stops the DOJ from doing anything."
As Section 2258A requires, providers do indeed report CSAM found on their services — 45 million times last year alone, according to a high-profile September 2019 New York Times story. They're complying with their mandatory reporting duties, or at least, nobody seems to be accusing them of noncompliance. If major tech companies were making a practice of flouting their duties under Section 2258A, the DOJ would be pursuing massive criminal penalties against them and it would be front-page news nationwide.
Nevertheless, despite providers' apparent compliance with the law, law enforcement and child-safety organizations have been vocally asserting in recent months that providers aren't doing enough to combat CSAM. Therefore, ostensibly to incentivize providers to do more, Senators Graham and Blumenthal have brought forth this new bill (which I'm assuming was drafted with significant input from DOJ and child-safety groups). The bill aims to hit providers where it hurts: their Section 230 immunity.
Summary of the EARN IT Act
The Graham/Blumenthal bill's core concept is reflected in its short title: the EARN IT Act. The idea is to make providers "earn" Section 230 immunity for CSAM claims, by complying with a set of guidelines that would be developed by an unelected commission and could be modified unilaterally by the Attorney General, but which are not actually binding law or rules set through any legislative or agency rulemaking process. There is a lot going on in this bill, but here is a very non-exhaustive list of just some of the bill's salient features, with my quick analysis under them:

  • Providers of interactive computer services must earn Section 230 immunity for CSAM
    • Bill removes immunity as to civil & state criminal claims for CSAM only; would not remove 230 immunity generally (for other claims, e.g. defamation)
    • Analysis: Interactive computer services (ICSes) are already defined in Section 230, and the bill does not change that.
      • The definition doesn't include devices, but does apparently cover messaging services, though there are few court cases about that
      • Example: bill would cover WhatsApp, Facebook Messenger, and Twitter DMs, but not iPhones
    • Analysis: ICS providers also include email providers, cloud storage providers, etc. — but this bill's effect would mostly be to make them maintain the status quo, since email and cloud storage accounts already are typically encrypted in such a way that they're still accessible to law enforcement
  • Section 230 immunity for CSAM can be earned via 1 of 2 safe harbors:
    • 1: Compliance with recommended best practices for the prevention of online child exploitation conduct, TBD by a new 15-member commission 
      • Analysis: Encryption, particularly end-to-end encryption, is likely to be targeted as being contrary to best practices for preventing CSAM, because if a provider cannot see the contents of files on its service due to encryption, it is harder to detect CSAM files (see the sketch just after this list).
      • The commission would include at least 4 law enforcement reps, 4 tech industry reps, 2 reps of child safety organizations, and 2 computer scientists/software engineering experts
        • Analysis: No representative is required to speak for users or civil society.
      • The commission shall consider users' interests in privacy, data security, and product quality
        • Analysis: This is very weak language; it means the commission can consider these interests for a few seconds, chuckle to themselves, and then move on.
      • The commission recommends best practices to the Attorney General, who has the power to unilaterally change them before they're finalized, as long as he writes up some reason for the changes.
        • Analysis: This means the AG could single-handedly rewrite the best practices to state that any provider that offers end-to-end encryption is categorically excluded from taking advantage of this safe-harbor option. Or he could simply refuse to certify a set of best practices that aren't sufficiently condemnatory of encryption. If the AG doesn't finalize a set of best practices, then this entire safe-harbor option just vanishes.
      • A best practice requires the approval of only 10 of the 15 commission members in order to be recommended on to the AG.
        • Analysis: This means that the commission could totally ignore both of the computer scientists, or both of the child safety org reps, or all 4 tech industry reps, so long as it can hit the 10-vote threshold.
      • An officer of the provider must certify compliance with the best practices; knowing false statements are a federal felony, carrying a fine and a 2-year prison term. 
        • Analysis: The language of the certification requirement doesn't sound optional; it sounds like officers are compelled to certify, whether it's true or not.
    • 2: Implementing other reasonable measures instead of the best practices
      • Unlike certifying compliance with the prescribed best practices, which guarantees Section 230 immunity, taking the reasonable measures option is not a guaranteed way of earning immunity.
      • Analysis: It's not exactly a real safe harbor if the provider still has to litigate the 230 immunity question. Providers that can't/won't/don't certify adherence to the best practices will have to take their chances on whether their chosen measures will be deemed reasonable by a court.
      • Analysis: Would a court find end-to-end encryption to be reasonable, when the goal is not data security, but instead, combating CSAM? Providers would struggle to reconcile their duty to provide reasonable data security, as imposed by the FTC and dozens of state data-security laws, with a conflicting duty not to encrypt information because it's unreasonable under the EARN IT Act.

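To make the detection point flagged in the analysis above concrete: server-side CSAM detection generally works by matching content the provider can see against databases of known material (real deployments typically use perceptual hash matching, such as Microsoft's PhotoDNA, against lists of previously identified files, so that re-encoded copies still match). The snippet below is only a minimal sketch in Python, with placeholder data and made-up names rather than anyone's actual scanning code, and it uses plain SHA-256 to keep things short. The structural point is that this kind of matching requires access to the plaintext, which is exactly what end-to-end encryption takes away from the provider.

    import hashlib

    # Placeholder stand-in for a list of hashes of known, previously identified files.
    # (Real deployments use perceptual hashes like PhotoDNA rather than SHA-256,
    # and the lists come from clearinghouses, not from a hard-coded set.)
    KNOWN_BAD_HASHES = {hashlib.sha256(b"placeholder known file").hexdigest()}

    def provider_side_scan(received_bytes: bytes) -> bool:
        """Server-side matching only works on bytes the provider can actually see."""
        return hashlib.sha256(received_bytes).hexdigest() in KNOWN_BAD_HASHES

    # Service that is NOT end-to-end encrypted: the provider receives the real
    # file bytes, so matching (and therefore reporting) is at least possible.
    print(provider_side_scan(b"placeholder known file"))            # True

    # End-to-end encrypted service: the provider only ever receives ciphertext.
    # Hashing ciphertext reveals nothing about the underlying file, so this scan
    # (or any other server-side content inspection) cannot work.
    print(provider_side_scan(b"\x93\x1f opaque ciphertext bytes"))  # False

Nothing in this sketch turns on the particular hashing scheme; the requirement is visibility into the plaintext, which is the thing at stake in the best-practices fight.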
The EARN IT Act Has a Number of Serious Problems 
This bill has a number of extremely serious problems, too many to fit into one blog post. It is potentially unconstitutional under the First, Fourth, and Fifth Amendments, for one thing. I have no hope of even enumerating all of the bill's deficiencies. Instead, for now, let me list just a few:

  • Thanks to Section 230(e)(1), federal prosecutors can already hold providers accountable for CSAM on their services (even if state prosecutors and civil plaintiffs can't). Piercing Section 230 immunity is not necessary if the idea is to penalize providers for their role in CSAM online, because the DOJ already has that power. If providers such as Facebook or Dropbox are breaking federal CSAM law, why isn't DOJ prosecuting them?
  • If those providers are complying with their duties under federal CSAM reporting law (Section 2258A), but DOJ and Congress still think they aren't doing enough and should do even more than the law requires, why isn't the answer to simply amend Section 2258A to add additional duties? Why bring Section 230 into it?
  • The bill would, in effect, allow unaccountable commissioners to set best practices making it illegal for online service providers (for chat, email, cloud storage, etc.) to provide end-to-end encryption — something it is currently 100% legal for them to do under existing federal law, specifically CALEA. That is, the bill would make providers liable under one law for exercising their legal rights under a different law. Why isn't this conflict with CALEA acknowledged anywhere in the bill? (We saw the exact same problem with the ill-fated Burr/Feinstein attempt to indirectly ban smartphone encryption.)
  • The threat of losing Section 230 immunity will be scary to major tech companies such as Facebook that try in good faith to abide by federal CSAM law. But that threat will have no effect on the bad actors in the CSAM ecosystem: dark web sites devoted to CSAM, which already don't usually qualify for Section 230 immunity because they have a direct hand in the illegal content on their sites.
  • If the good-faith platforms implement the new best practices to detect CSAM for fear of losing Section 230 immunity, but the bad actor CSAM sites don't, then CSAM traders will leave the good-faith platforms for the bad ones, where they'll be harder to track down.
  • The CSAM traders who do stay on the good-faith platforms (say, Facebook) will still be able to encrypt CSAM before sending it through, say, Facebook Messenger, even if Facebook Messenger itself were to no longer have any end-to-end encryption functionality. Even if the EARN IT Act bans providers from offering end-to-end encryption, that won't keep CSAM offenders from cloaking their activities with encryption. It will just move the place where the encryption happens to a different point in the process. File encryption technology is out there, and it's been used by CSAM offenders for decades; the EARN IT Act bill can't change that.

Let me say a little more about these problems. But first, I want to point out to you that if you believe this bill is about finally holding Big Tech accountable after it got too big for its britches under a permissive Section 230 regime, you are being had.
The EARN IT Act Is A Bait-And-Switch
While the EARN IT Act is ostensibly aimed at Section 230, it's actually a sneaky way of affecting CALEA without directly amending it. Remember the two carve-outs in CALEA that I discussed above, for encryption and information services? Both DOJ and the Federal Bureau of Investigation (FBI) have been trying for at least a decade to close them. But Congress has shown no appetite for that. As said, CALEA has never once been amended in the quarter-century since it was passed. And even with the techlash in full swell, there isn't a furious public frenzy over CALEA. Politicians know that many Americans are fed up with tech companies hiding behind Section 230 of the CDA. But nobody is saying, "I'm fed up with tech companies hiding behind Section 1002 of CALEA!"
So, how can law enforcement achieve its long-desired CALEA goal? By pushing a bill that talks about Section 230 instead. 
People are angry about Section 230, so the DOJ is seizing upon that anger as its opening to attack encryption. I've been saying since 2017 that federal law enforcement agencies would take advantage of anti-Big Tech sentiment to get their way on encryption. Now the techlash is strong enough that they're finally making their move. The bill is ostensibly taking a shot at Section 230, but that shot will ultimately land on CALEA.
Remember, CALEA makes it perfectly legal for providers of information services (such as Apple and Facebook) to design encryption that is not law enforcement-friendly. And even telco carriers can encrypt calls and throw away the decryption key. End-to-end encryption is legal under current federal law. Yet the EARN IT Act would allow an unelected, unaccountable commission to write best practices (not actual laws or regulations, yet liability would result from failing to abide by them) which, make no mistake, will condemn end-to-end encryption. The commission, after all, would be acting in the shadow of an Attorney General who despises encryption. For Barr, encryption can only be a worst practice. By engaging in that worst practice, companies would risk facing the potentially ruinous tide of litigation that Section 230 presently bars. That is: the EARN IT Act would use one law — a narrowed Section 230 — to penalize providers for exercising their rights under a different law — CALEA. Providers would be held legally liable for doing exactly what federal law permits them to do. 
Meanwhile, providers are also evidently doing exactly what federal CSAM law compels them to do (report CSAM). As said, nobody's claiming providers are violating their reporting duties under Section 2258A. They're doing as the law requires. If they weren't, they could already be held liable under the existing law, and Section 230 wouldn't save them. They're not violating those reporting duties by providing end-to-end encryption (as permitted by CALEA). As said, under Section 2258A, providers need only report CSAM they actually know about, and are under no duty to affirmatively look for CSAM by monitoring and filtering content. Nor are they under any duty to be able to see every piece of content on their service. If a provider doesn't know about a piece of CSAM because it can't see it due to end-to-end encryption, that does not run afoul of the CSAM reporting duties.
If Congress wants to cut off tech companies' and telco carriers' freedom to provide end-to-end encryption under CALEA's two carve-outs, then Congress should write a bill that amends CALEA! Then we can have that conversation. If Congress wants providers to do more to fight CSAM, including something crazy like universal monitoring and filtering obligations, then Congress should write a bill that amends Section 2258A! Then we can have that conversation. But the EARN IT Act bill doesn't do either of those things.
The EARN IT Act bill is supposedly about CSAM, and it's not-so-secretly about encryption. Yet it doesn't amend the laws that are actually relevant to those topics. Instead, it targets another law entirely, Section 230, largely because it's politically expedient to do so right now. This bill is a cynical ploy to exploit current anti-Section 230 sentiment in order to achieve an unrelated anti-encryption goal (one which, by the way, would be disastrous for cybersecurity, privacy, the economy, national security, …). Congress should not kill the freedom to encrypt by taking advantage of Section 230's current unpopularity to get away with a bait-and-switch.
Amending 230 to affect CALEA is not the only bait-and-switch, though. Look closer: Why does law enforcement hate end-to-end encryption? Because it renders private conversations invisible to law enforcement (and to the provider of the messaging service). Now: Why is everyone mad about Section 230? Not because of private conversations — because of what's said in public, in full view of law enforcement and the provider and everyone else. When people think of Section 230, they're probably thinking of the horrible things they've seen in public tweets, Facebook posts, and every comments section ever — the stuff that makes it no fun to be online anymore. They're probably not thinking about private conversations over chat apps.
The vast majority of court cases involving Section 230 (where someone tries to sue a provider, but the case gets dismissed as barred by the statute) are defamation cases involving public online speech. Almost nobody is suing messaging providers because of something someone said in a private conversation. Section 230 has very, very rarely been invoked in that context,[7] and that's not what motivates the law's current unpopularity. Nevertheless, private conversations are the not-so-secret reason for the EARN IT Act proposal to amend Section 230.
In short: This bill takes popular rage at social media companies' immunity under Section 230 for public speech on their platforms, and twists it into a backhanded way of punishing messaging service providers' use of encryption for private conversations. That's the deeper bait-and-switch.
Yes, this is about CSAM, not defamation or other less-egregious offenses. But it's really solely about CSAM that occurs on end-to-end encrypted private messaging services. End-to-end encrypted private messaging is the only context where there's any real need to threaten to punish providers for not doing enough about CSAM, because it's the only context where providers don't already report CSAM very frequently (because they can't see the contents of end-to-end encrypted files; they can, of course, report CSAM where it's reported to them by one of the ends of the communication). That 45 million reports number in the New York Times story is testament to how often providers are already reporting — and it makes it clear that it's really only the specter of providers moving to encrypt more communications end-to-end (specifically, on Facebook Messenger) that is motivating this bill.
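For readers who want the mechanics behind "they can't see the contents": in an end-to-end encrypted service, only the endpoints hold the decryption keys, and the provider in the middle stores and relays opaque ciphertext. Here is a minimal sketch using the PyNaCl library (chosen purely for illustration; real messaging apps such as WhatsApp layer much more on top, e.g. the Signal protocol's key agreement and ratcheting). Note that the recipient, as one of the "ends," still has the plaintext and can report it to the provider, which is how reporting on end-to-end encrypted services works today.

    from nacl.public import PrivateKey, Box

    # Each user generates a keypair on their own device; the provider never
    # holds either private key.
    sender_key = PrivateKey.generate()
    recipient_key = PrivateKey.generate()

    # The sender encrypts to the recipient's public key before anything is
    # handed to the messaging service.
    ciphertext = Box(sender_key, recipient_key.public_key).encrypt(b"hello")

    # The provider's servers only ever store and relay this opaque blob.
    # They cannot decrypt it, so they cannot scan it for CSAM or anything else.
    relayed_by_provider = bytes(ciphertext)

    # The recipient, the other "end," is the only other party who can recover
    # the plaintext (and who could, for example, report it to the provider).
    plaintext = Box(recipient_key, sender_key.public_key).decrypt(relayed_by_provider)
    assert plaintext == b"hello"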
If the goal is to get providers to do more about CSAM, then there is no need to threaten to strip Section 230 immunity for CSAM that's posted in public contexts (such as a public tweet) or private messages that aren't end-to-end encrypted (such as Twitter DMs), because providers already report that in accordance with federal law. And if they violated federal criminal law, then as said, Section 230 wouldn't save them. The purported rationale of incentivizing additional action on CSAM doesn't hold up to scrutiny, because providers are already doing everything they are supposed to do (and indeed more than that), wherever they have the ability to see content. Ultimately, the problem this bill is really trying to solve isn't inadequate provider action on CSAM. The problem is end-to-end encryption.
Put another way, the problem is that there isn't currently a mandate on providers to ensure they can see every piece of content on their services, and to then affirmatively, proactively monitor and filter all of it. That is, the problem is that legislators haven't yet tried to force providers to conduct total surveillance of every piece of information on their services. And rather than propose a law that explicitly demands total surveillance — something that would force elected legislators, in an election year, to be accountable to their constituents for proposing it — Senators Graham and Blumenthal are instead trying to duck accountability, hide behind Section 230's unpopularity, and let an unelected, unaccountable baseball team's worth of people (the thumbs-up of 10 of the 15 commission members being the necessary minimum, as said) write best practices (again, not actual laws or even rules subject to mandatory agency rulemaking processes) that can be unilaterally rewritten by Attorney General Barr however he pleases.
The EARN IT Act Won't Stop CSAM Online
The really galling thing about this bill is that, like SESTA/FOSTA before it, it won't work. All SESTA/FOSTA did was put sex workers in more danger by making it harder for them to screen clients and share information with one another. Similarly, the likely effect of the EARN IT Act would be to induce CSAM traders to make their actions harder to detect. For one, they could still encrypt their illegal files anyway, regardless of any best practices implemented by providers. For another, they'd be incentivized to move off of big, legitimate platforms such as Facebook, which are already acting in good faith and complying with current CSAM law, and shift to dark web sites whose entire raison d'être is CSAM.
Threatening to curtail Section 230 immunity is only going to scare the good-faith providers. Those are the very ones that are already complying with Section 2258A and could be expected to comply with any additional duties Congress cared to add to 2258A. But the immunity threat will have no effect on the services that do not comply and do not care. The sites that are dedicated to CSAM are directly violating federal CSAM law. They certainly aren't following Section 2258A's reporting requirements. That means they already don't qualify for Section 230 immunity. As Section 230(c)(1) states, Section 230 immunity bars attempts to treat providers as "the publisher or speaker" of information, such as CSAM, provided by someone else. If a site helps to create or develop the illegal content, it is not eligible for the immunity. Put simply, Section 230 is not carte blanche for the provider itself to violate the law.
Since sites devoted to CSAM already don't qualify for Section 230 immunity, threatening to take that immunity away unless they earn it under this new bill will have absolutely zero effect on them. CSAM traders, who are highly adaptable to shifts in law enforcement strategy, will know that and act accordingly. And they'll be much harder to track down on dark web sites than they are when they're using their Facebook accounts.
Plus, the EARN IT Act won't rid even the good-faith online services of CSAM. Even if there are best practices that induce a company not to provide end-to-end encryption, users can still encrypt files before transmitting them over the company's service. Standalone file encryption software is already out there; that genie won't go back in the bottle. Punishing the provider by stripping Section 230 immunity won't fix this issue. As Section 1002(b)(3) of CALEA recognizes, it makes no sense to try to hold a company responsible for an encrypted communication if the company wasn't the one that provided the encryption.
The EARN IT Act Gives More Power to a Creepy Attorney General
Last but not least, the EARN IT Act gives AG Bill Barr the unchecked power to decide what best practices providers must implement if they want to guarantee that they'll retain Section 230 immunity for CSAM.[8] He can make providers dance for him, and he calls the tune. There is approximately zero chance that Barr won't use that unilateral authority to set best practices that undermine Americans' communications privacy. Right now, encryption is perfectly legal, and it stymies Barr's ability to illegally snoop en masse on Americans' communications, like he did the last time he was AG. This bill feels like it's motivated by Barr's wish to punish providers for frustrating his creepy desire to spy on everyone.
But if he said that out loud, it wouldn't go over well, so instead his punitive thirst is dressed up under the cover of Section 230, because Section 230 gets people riled up. He's realized that if he invokes Section 230 in the current climate, he can get Americans to cut off their own nose to spite their face. He can get them to give away their entitlement to strong encryption that protects them from him, so long as they think that'll stick it to Big Tech.
Conclusion
There is so much wrong with the EARN IT Act bill. I hope other members of civil society will chime in against this bill to explain these many other problems, because I've gone on long enough already and I've only addressed one tiny piece of why this bill is such a nightmare.
It's understandable if you have misgivings about the breadth of Section 230. It's okay to be angry at Big Tech. But don't let Senators Graham and Blumenthal dupe you into believing that the EARN IT Act would provide any vindication for you against the major tech companies. The bill's ultimate intent is to penalize those companies for protecting your privacy and data security. That's something that tech companies have been legally allowed to do for a quarter-century, and we can't afford to stop them from doing it. Encryption should be encouraged, not punished. If you value your privacy, if you value data security, or if you just don't want to see our rogue Attorney General singlehandedly set the rules for the Internet, you'll contact your congressmembers and oppose the EARN IT Act.
[1] This is a bastardization of the opening lines of Douglas Adams' The Restaurant at the End of the Universe.
[2] With regard to most civil and state criminal law claims, that is. Section 230 has always had a few exceptions, most significantly for federal criminal and intellectual property law.
[3] With apologies to James Madison and Federalist #51.
[4] The Ackerman case is mostly about Fourth Amendment issues that I won't get into, at least not today.
[5] Absent this safe harbor, providers are put in an untenable position, because CSAM is basically radioactive. Reporting CSAM by passing it along would otherwise be a crime, because transmitting CSAM is a crime; preserving CSAM as evidence would otherwise be a crime, too, because possessing CSAM is also a crime. Section 2258B was necessary to ensure that providers don't have to face significant criminal liability for helping to fight CSAM.
[6] Although it expressly mentions the CSAM law, Section 230(e)(1)'s criminal-law exception does not allow Section 2255 civil claims against providers for CSAM content posted by users; at present, those are barred by Section 230's broad immunity for civil claims. MA ex rel. PK v. Village Voice Media Holdings, 809 F. Supp. 2d 1041, 1055-56 (E.D. Mo. 2011) (citing Doe v. Bates, No. 5-cv-91, 2006 WL 3813758 (E.D. Tex. Dec. 27, 2006)). The EARN IT Act bill would change that.
[7] While courts have seldom addressed the applicability of Section 230 to private messaging services, a few courts have applied the law to bar claims predicated on a defendant's transmission of nonpublic messages, and have done so "without questioning whether [Section 230] applies in such circumstances." Fields v. Twitter, Inc., 200 F. Supp. 3d 964, 975-76 (N.D. Cal. 2016) (citing Hung Tan Phan v. Lang Van Pham, 182 Cal. App. 4th 323, 324-28 (2010); Delfino v. Agilent Techs., Inc., 145 Cal. App. 4th 790, 795-96, 804-08 (2006); Beyond Sys., Inc. v. Keynetics, Inc., 422 F. Supp. 2d 523, 528, 536-37 (D. Md. 2006)), aff'd, 881 F.3d 739 (9th Cir. 2018). Thanks to Jeff Kosseff for pointing me to this authority.
[8] The bill also gives him the power to launch intrusive investigations of any company he can claim he believes isn't abiding by its certification to those best practices, using a tool called civil investigative demands (CIDs), but I'm not even going to get into that.