Does Section 230 Provide Platforms Too Much or Too Little Immunity?

Perhaps the most prominent current technology policy debates concern “big tech,” specifically Facebook, Twitter, Amazon, and Google.  The most politically contentious issue among them involves the ability of these platforms to control content without being liable for the content they select.  This has recently been the focus of a civil suit filed by former president Donald Trump against several of these platforms for suspending his access to them.  The debate centers on the infamous Section 230, part of the Telecommunications Act of 1996, passed when the Internet was barely a shadow of what it has become today.  The specific focus of attention is Sec. 230(c)(1):

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

This phrase has been described as “the twenty-six words that created the Internet,” reflecting the view that on-line platforms of all sorts would not have been able to develop were they under constant threat of lawsuits akin to those faced by the publisher of a book or newspaper.

This provision has become the target of those who believe platforms have too much power over information.  While criticism is bipartisan, its nature is polarized.  Progressives argue that Sec. 230 allows platforms to spread false claims.  Conservatives believe that Sec. 230 enables platforms to censor speakers and speech, and that platforms use this power disproportionately against conservative voices.

This debate has taken on greater prominence since Facebook and Twitter “deplatformed” then-President Trump following the insurrection at the U.S. Capitol to disrupt the Senate’s official certification of the Electoral College vote designating current President Biden as winner of the 2020 election.  Conservatives, recently including Supreme Court Justice Clarence Thomas, widely take this deplatforming as justifying efforts to limit the power of these platforms to control the content that goes over them.  And, as noted above, it is the subject of a lawsuit filed by Mr. Trump.

These competing worldviews leave the platforms between the proverbial rock and hard place: When they leave some content up they anger progressives, and when they take some content down they anger conservatives. Unfortunately, it is often the same content that one side wants down and the other wants up.

Rather than limiting platform liability, as many believe Sec. 230’s primary role to be, some parts of Sec. 230 may, in fact, create potential problems for platforms by limiting Sec. 230(c)’s protection. Specifically, Sec. 230(e), which gets little attention, states:

Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.

The key phrase is the last six words, “or any other Federal criminal statute”.  I am not intimately familiar with the history of litigation under Sec. 230 generally, but my sense is that those cases concern the extent of platforms’ immunity for content they carried, not their justifications for content they declined to carry.

I assume that the forced entry into the Capitol Building, which resulted in fatalities, injuries to members of the Capitol Police, and property destruction, violated numerous Federal criminal statutes.  If so, Twitter and Facebook may have asked themselves, in the wake of Jan. 6, whether they would risk being the targets of “the enforcement of … any other Federal criminal statute” were they to continue to give President Trump an outlet for speech allegedly inciting further similar actions. In other words, while the platforms were, on the one hand, immune because of Sec. 230(c) (and, arguably, the First Amendment), they may have worried about prosecution for conduct not immunized under Sec. 230(e).

If this is an explanation, a natural next question is why the platforms haven’t invoked it against arguments to strip Sec. 230 immunity.  After all, if the platforms are facing threats of legislation to subject more of their conduct to government oversight, one would think they would invoke Sec. 230(e) as a justification for their actions.  I speculate that they worry that such an argument could open a floodgate of lawsuits by confessing potential liability for content they allowed to be transmitted before Jan. 6 that allegedly fostered the riot.  What other federal crimes were committed that someone might try to connect to social media posts?  It will be interesting, to say the least, if the platforms invoke 230(e) as a defense in Mr. Trump’s lawsuit, and the case becomes in substance, if not in name, a trial of Mr. Trump’s culpability under the Insurrection Act.

Ultimately, though, none of these arguments may matter. Until Congress repeals the First Amendment, private platforms have constitutional protection regardless of Section 230.  Is there any policy that progressives and conservatives might both favor? A favorite remedy is requiring platforms to be “transparent” about how they make deplatforming decisions. However, that only raises further questions.  Transparency will make platforms that censor vulnerable to accusations of failing to honor commitments if those platforms had advertised that they do not censor.  Likewise, transparency will make platforms that distribute false information similarly vulnerable if they had claimed they do not spread demonstrably incorrect claims.

Sadly, the result of these paradoxes could be two parallel Internet universes, one dedicated to veracity and the other to openness. They may not be able to coexist, given that every four years these universes have to come together to choose a common president, and even more frequently for other elections.  Based on recent experience, let’s hope we don’t create such a multiverse.  We’re already perilously close.

View More Publications by Tim Brennan
