Section 230 of the Communications Act of 1934, added by the Telecommunications Act of 1996 and commonly known simply as "Section 230", has been largely responsible for the proliferation and success of Internet-based social networking sites. It has also been attacked by both liberals and conservatives, though their complaints against the law differ greatly. Liberals want to use it to force service providers to censor more user content; conservatives want to use it to stop such censoring. I agree with the conservatives on this one: specifically, service providers should not be granted legal immunity for user content when they act to limit or restrict access to some of that content.
BACKGROUND
The purpose of Section 230 can be derived from its preamble, set out in the subsections titled "Findings" and "Policy". One specific finding provides a clue as to its intent:
(a)(3) The Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.
So at least one intent is clear: the preservation of free speech (“… a true diversity of political discourse …”) on the Internet via interactive computer services. But how exactly can the law provide for this intent?
Section 230 encourages diverse discussions on interactive services by providing legal protection to the service provider or interactive service, immunizing them from lawsuits based on the content provided by their users:
(c)(1) TREATMENT OF PUBLISHER OR SPEAKER.–No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
Thus content is considered only a product of the actual content provider; the service provider or interactive service is considered only a distribution system. A useful analogy is the relationship between an author and a printer: you can sue an author for what they write, but not their printer. Another comparison that seems even more appropriate (given that Section 230 was enacted as part of the Telecommunications Act) is that between a telephone company and its users; you cannot sue the telephone company for what one of its users says on a phone call.
But Section 230 goes further. Reviewing other paragraphs in the "Findings" and "Policy" subsections sheds additional light on the intent of the law:
(a) Findings.–The Congress finds the following:
…
(2) These services offer users a great degree of control over the information that they receive, as well as the potential for even greater control in the future as technology develops.
…
(b) POLICY.–It is the policy of the United States–
…
(3) to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services;
(4) to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children’s access to objectionable or inappropriate online material;
…
[emphasis mine]
Even the title of Section 230 lends some insight into its intent:
SEC. 230. [47 U.S.C. 230] PROTECTION FOR PRIVATE BLOCKING AND SCREENING OF OFFENSIVE MATERIAL. [emphasis mine]
These indicate that the law is intended to encourage the development of tools that would allow users to control the content to which they are exposed. Note that this concept of user control is well in line with our concept of free speech and the development of "… a true diversity of political discourse …". Free speech may give you the right to speak, but it does not allow you to force others to listen. In keeping with our telecommunications theme, this would protect a user's right to block calls from callers that the user does not want to hear. But here Section 230 departs from that theme in an important way: no one would expect the phone company to decide which calls you would be allowed to receive. Yet this is a power granted to service providers by Section 230:
(c)(2) CIVIL LIABILITY.–No provider or user of an interactive computer service shall be held liable on account of–
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
[emphasis mine]
Thus Section 230 also gives interactive service providers immunity for their own actions in restricting content or access. It is important to note, however, that this runs counter to the intent outlined in the "Findings" and "Policy" subsections. Rather, the intent appears to be for users to control their own content with tools provided by the interactive service or a third party, with immunity extended to those users and to those providing such tools.
THE PROBLEM
Section 230 provides much-needed protection to interactive service providers from lawsuits arising out of user contributions. However, the protections provided to service providers for their actions restricting access or content are troublesome on several levels.
One concern is that the protections given to providers for restricting access or content are inconsistent with the protections for user content. The liability protection Section 230 provides for user content rests on the concept that the service provider is the digital equivalent of a printer, as opposed to a publisher. A publisher selects what to publish, and as a result can be held liable for that content. The interactive service provider does not select; the content is the product of the user, and only the user is held liable. But what about the interactive service provider that selects what not to print? Doesn't that give tacit approval to what remains (that which is printed)? Doesn't that make it a publisher, and if so, shouldn't it be held accountable for its content? Providers should not be able to have their cake and eat it too; they can have protection for their restrictions or for the content provided by users, but not both.
Another, greater concern is that Section 230's protections for providers who restrict content or access create an indirect path for the government to limit speech. Restrictions are ostensibly based solely on the good faith of the service provider, but how can we be certain? Both Republicans and Democrats have suggested that the law could be changed to remove some of its protections, or even to regulate interactive service providers directly. Is it possible that these suggestions are perceived as threats to the billion-dollar social media industry, and that these threats sway what content providers restrict or limit? The prospect of indirect, back-door limits on speech imposed through the threat of regulation by government or political entities is disconcerting.
Finally, the denial of service (access restriction) to a select few, particularly by a monopoly service associated with speech or communication, opens the door for abuse by those who wish to limit particular viewpoints. Social media has become the de facto means for many individuals to communicate with others in a general way; the platforms themselves have in a sense become a method of speech. But monopolies have arisen in the industry (think Facebook, Twitter, and YouTube), producing entities with no equivalent alternatives. This is due in part to the isolation between competing social media platforms; in our telecommunications theme, the equivalent condition would be one in which each independent telephone company was unable to route calls to users on any other phone system. In such a case there would be no reasonable alternative to the dominant company (the one with the largest number of users). Thus, allowing providers to restrict access to these systems, particularly for politically driven reasons, can have a dramatic impact on the "… true diversity of political discourse …" envisioned by the creators of Section 230. Restricting content raises similar concerns; continuing the telephone analogy, can you imagine a telephone company that was allowed to decide which calls you would receive based on their content?
Note that I am loath to suggest that independent, non-government entities be regulated unnecessarily, but when the government offers special protections for monopoly businesses it should get something in return – in this case, the free and open discussion envisioned by the authors of Section 230.
THE SOLUTION
Section 230 should be rewritten to exclude protections for interactive service providers who restrict access or content. If social media companies want to control their content, so be it; but they should then be treated like the publishers they will have become.