Section 230: Reinterpret, don't Abolish


*Updated 11/20/21 to clarify language from "enforcement" to "interpretation"

Table of Contents

  1. What is CDA 230?
    a. A final bit of legal definitions...
  2. What should be done about CDA 230?
    a. Reinterpretation
    b. The Tech Bakery
    c. Information Content Providers
    d. Challenges
  3. Conclusion
  4. Update (10/20/21)

Section 230 of the Communications Decency Act (commonly referred to as 'Section 230' or as 'CDA 230') is a popular and divisive topic. Chances are that if you are reading this you have at least heard of CDA 230 before.

Congressmembers and presidential candidates from both major parties in the US have called for a repeal of CDA 230, yet a broad array of voices from both parties (as well as the private sector) have called for its protection; beliefs about this very short section of law conflict sharply.

While any change to CDA 230 will significantly alter the way the internet functions today, it is clear that the status quo is not acceptable. Ultimately, CDA 230 can be reinterpreted today, with no changes to the law, to mitigate harms while minimizing the impact on the internet; doing so, however, requires an updated understanding of how content is produced.

First, though, what is this significant little law?

What is CDA 230?

You can read the text of the law here, but the Electronic Frontier Foundation—a little like the ACLU of techy things—has written an excellent primer on CDA 230 here that I'll quote from:

[Section 230 ensures that] online intermediaries that host or republish speech are protected against a range of laws that might otherwise be used to hold them legally responsible for what others say and do...In short, CDA 230 is perhaps the most influential law to protect the kind of innovation that has allowed the Internet to thrive since 1996.

A final bit of legal definitions...

The most-relevant portion of Section 230 says:

"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." (47 U.S.C. § 230).

The law goes on to define an 'Interactive Computer Service':

(2) The term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.

and an 'Information Content Provider':

(3) The term “information content provider” means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.

So, 'interactive computer service' users and providers are protected, while 'information content providers' are not.

With those definitions out of the way...

What should be done about CDA 230?

Reinterpretation

Section 230 should not be repealed, nor should companies like Facebook, YouTube, or Instagram be carved out of the protections provided by CDA 230. Rather, the government should apply a new interpretation of CDA 230 using its current language.

By providing curated recommendations, many providers of interactive computer services have instead become Information Content Providers.

These companies would like to be viewed as newsstands or bookstores that merely distribute the content of authors. That is not correct.

When they provide a curated subset of the content they believe will be most relevant or meaningful, companies create new content and become Information Content Providers as defined in CDA 230. Companies do not lose their CDA 230 protections for content provided by users, but they are not covered by CDA 230 when they produce their own content in the form of recommendations or rankings.

Under this interpretation, companies would still be protected from liability when users provide obscene content (for example, uploading a livestream of a terrorist shooting, as in Christchurch, New Zealand), but companies would not be protected when they incorporate that content into their own products and present it through rankings and recommendations.

The Tech Bakery

By claiming not to provide new content, companies like TikTok, LinkedIn, and Snapchat are like bakeries that say: "we don't produce any individual content, we just ingest a bunch of content from other providers (flour, eggs, sugar, milk), run it through an algorithm (recipe), and present it."

This analogy exposes the key issue: do the algorithms used to manipulate content provided by users produce a new piece of content, or do they fail to effect material differences?
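To make that question concrete, here is a minimal sketch of a toy feed. Every name, field, and weight in it is hypothetical rather than drawn from any real platform: the first function arguably fails to effect material differences, while the second scores, filters, and re-orders the same user posts into a new, curated compilation.

```python
from datetime import datetime

# Hypothetical user-provided posts; the fields and values are illustrative only.
posts = [
    {"author": "UserX", "text": "cake recipe thread", "posted": datetime(2021, 10, 1), "likes": 40},
    {"author": "UserY", "text": "local news update", "posted": datetime(2021, 10, 2), "likes": 3},
    {"author": "UserZ", "text": "cake photos", "posted": datetime(2021, 10, 3), "likes": 17},
]

def chronological_feed(posts):
    # Every post, newest first: arguably no material difference from
    # the content the users themselves provided.
    return sorted(posts, key=lambda p: p["posted"], reverse=True)

def curated_feed(posts, interests, limit=2):
    # The platform's own judgment, expressed as code: it scores, filters,
    # and re-orders the inputs into a new compilation.
    def score(post):
        relevance = sum(word in interests for word in post["text"].split())
        return 2.0 * relevance + 0.1 * post["likes"]  # hypothetical weights
    return sorted(posts, key=score, reverse=True)[:limit]

print([p["author"] for p in chronological_feed(posts)])      # ['UserZ', 'UserY', 'UserX']
print([p["author"] for p in curated_feed(posts, {"cake"})])  # ['UserX', 'UserZ']
```

The output of curated_feed exists nowhere in the users' contributions: the selection, the ordering, and the decision of what to leave out are the platform's own.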

In the case of the bakery the answer is obvious—a cake is clearly new content and not just re-ordered ingredients. In the case of these companies, the companies themselves help provide the answer: by classing their algorithms as trade secrets, they demonstrate that they believe their algorithms are used to produce new and unique works, and they want to protect that ability.

To be clear, the protection of these algorithms as trade secrets is not what makes their output new content (the algorithms would still produce new content even if they were public), but it demonstrates that these companies believe their algorithms produce new and unique works: they have kept the algorithms private precisely to ensure their offerings stay unique.

Information Content Providers

Ranking and recommendation algorithms process, analyze, and present a collection of data in a specific order and fashion. Processing, analyzing, and presenting that content is the "creation or development of information."

By viewing their algorithms as trade secrets, companies demonstrate that those algorithms don't produce a generic or non-transformative collection of content provided by third parties, but instead produce a new compilation or derivative work.

By being responsible "in whole or in part" for that creation or development, companies become Information Content Providers as defined in CDA 230, and are responsible for the new content their algorithms present.

Such companies are simultaneously Interactive Computer Service providers and Information Content Providers. They should not be liable for the content provided by 'UserX,' but they should be liable for the new content (their recommendations and prioritized rankings) they produce that incorporates content from users.
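One way to picture this dual role is as two separate records. The sketch below uses my own hypothetical data model, not any company's actual one; the point is simply that only the second record is authored by the platform.

```python
from dataclasses import dataclass

@dataclass
class Post:
    # Provided by a user. Under this interpretation, the platform keeps
    # its CDA 230 protection for merely hosting this record.
    author: str
    text: str

@dataclass
class Recommendation:
    # Produced by the platform. The rank, score, and targeting are the
    # platform's own creation: new content that incorporates, but is
    # distinct from, the user's post.
    post: Post
    rank: int
    score: float
    shown_to: str
```

The Post would remain protected third-party speech; the Recommendation is the platform's own work product.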

Challenges

Applying this interpretation to CDA 230 will come with challenges. I've written about the key challenges I see, as well as potential solutions, here.

Conclusion

Treating companies as Information Content Providers when they produce new content, while still providing CDA 230 protections when they function merely as providers of Interactive Computer Services, will mitigate the harms caused by companies' ranking and recommendation algorithms and still ensure a free and open internet that incentivizes speech.

Update (10/20/21)

Since I wrote this article, Congressional leaders on the Energy and Commerce Committee released draft legislation to amend Section 230. The bill, titled the 'Justice Against Malicious Algorithms Act' or 'JAMAA,' can be read here.

Section 230 should not be amended or abolished: under its current language, large companies should already not be receiving blanket protections under CDA 230, and attempts to amend Section 230 in a way that curbs large tech companies while continuing to promote free speech online will fail. If the government today lacks the will to identify where CDA 230 does or does not apply, instead granting blanket protections, then amending the law with hair-splitting legal definitions will play into the hands of the large companies for whom these algorithms are a vital interest.

Mike Masnick from Techdirt has released a longer article critiquing the JAMAA bill. He does a good job of identifying some specific problems with JAMAA, but stumbles when he refers to regulating "amplification":

The algorithmic promotion that tech companies like Facebook and YouTube engage in is not amplification ("The process of increasing the volume of sound, especially using an amplifier"), but is instead more like DJing—producing an intentional and unique mix designed to please your audience.

This suggestion that algorithms are simple amplifiers that merely spread users' beliefs more broadly, without intentional design or business optimization from the developers, is the central lie in Nick Clegg's apologetic on Facebook's algorithms that I responded to earlier this year.

CDA 230 should not be abolished or amended. It can and must be reinterpreted more narrowly.


Interest piqued? Disagree? Reach out to me at TwelveTablesBlog [at] protonmail.com with your thoughts.

Photo by Sara Kurfeß on Unsplash