In 1996, Congress passed the Communications Decency Act (CDA) with Section 230, which states that United States policy is to “encourage the development of the Internet and other interactive computer services” and to “empower parents to restrict their children’s access to objectionable or inappropriate online material.”
Section 230 implements those policy goals in two key subsections. Section 230(c)(1) shields platforms from liability for harmful content posted by third-party users, and Section 230(c)(2) allows platforms to remove harmful content without incurring the legal responsibilities of a content publisher.
After 25 years, it’s time for Congress to examine how well those subsections are accomplishing the law’s stated goals—particularly those of empowering parents and protecting children.
A Wall Street Journal article published last week provides insight into that question. The article, titled “How TikTok Serves Up Sex and Drug Videos to Minors,” discusses how the Journal’s reporters created 31 TikTok accounts registered as belonging to users aged 13 to 15, turned those accounts loose to browse videos on TikTok’s platform, and observed what videos TikTok’s algorithms recommended to the supposed teenage owners of those accounts.
In one case, the Journal found that TikTok recommended “at least 569 videos about drug use, references to cocaine and meth addiction, and promotional videos for online sales of drug products and paraphernalia” to an account registered as belonging to a 13-year-old. In recommending other videos, TikTok pushed the Journal’s supposed teenage users into what the Journal described as a “rabbit hole that many users call ‘KinkTok,’ featuring whips, chains and torture devices.” In doing so, TikTok recommended at least 2,800 videos that had been restricted to “adults only” by the users who had posted those videos.
This is only the most recent example of how online user-generated content platforms are harming children. A Senate investigation in 2017 found that Backpage.com had “knowingly” participated in sex trafficking, including trafficking of minor victims, and in 2020 the New York Times reported that searches for “girls under 18” and “14yo” on the website Pornhub returned over 100,000 videos, including many that depicted “child abuse and nonconsensual violence.”
All of these examples have one thing in common: The companies involved were shielded from potential liability by the legal protections provided in Section 230.
Last month we published an article in the Harvard Business Review arguing that Section 230 needs to be reformed by restoring a common-law “duty of care” standard to the law, and that this reform can be enacted in a way that maintains the law’s original policy goals of creating a vibrant market for user-generated content while still empowering parents and protecting children.
–Michael D. Smith, J. Erik Jonsson Professor of Information Technology and Marketing at Carnegie Mellon University’s H. John Heinz III College, and Marshall Van Alstyne, Questrom Professor of Information Economics at the Boston University Questrom School of Business