
“I’m sorry for everything you’ve been through,” said Mark Zuckerberg, Meta founder and CEO, in the most recent congressional hearing on online child safety. Disingenuous as these words may be, they are more than just an apology from the tech mogul. They signify the culmination of failed government action, a rise in companies set on making record profits and a blatant disregard for social media users.

The “Protecting Our Children Online” committee hearing, held by the Senate Judiciary Committee on Jan. 31, was the most recent installment in a series of congressional hearings involving the leaders of companies in the digital sphere. Previous hearings of this type included a hearing in 2018 with Mark Zuckerberg over data privacy, one in 2021 concerning misinformation throughout the internet and another in 2023 about content moderation on X, formerly known as Twitter. Much like the previous three hearings mentioned, the recent child safety hearing has yet to produce any legislative results and will likely not do so unless there is a major bipartisan effort. 

As Congress once again debates regulations for content on the internet, the need for government action is only growing. In 1996, Section 230 of the Communications Decency Act was passed, overhauling U.S. telecommunications and internet regulations. The act, however, has been significantly adjusted only once since its passage. Internet policy is overdue for a shift. The government needs to amend Section 230 of the Communications Decency Act by adding consequences for the owners of websites, especially websites where issues of abuse and safety are most prevalent.

In order to understand why a law passed nearly three decades ago needs to change, we first need to understand what exactly the law has enabled on the internet. At its core, Section 230 removes liability from website owners for the content posted on their site. Its impact, however, has been much larger than anticipated.

Section 230’s absence of liability for websites isn’t just the enabler for the modern internet — it is the modern internet. Sites from YouTube to Instagram to TikTok and even Pinterest rely on this section for their existence. Section 230 enables each of these sites — along with millions more message boards, social media platforms and other sites where users can create their own material — to exist without worry of what content users produce. 

The current internet and the idea of Web 2.0, an internet where user-created content is front and center, owe their existence to this very act. However, just because the companies that host user-created content bear no liability doesn’t mean every user is immune. In cases of speech unprotected by the First Amendment, such as libel and defamation, the user is directly at fault. Website owners, on the other hand, don’t face the same liability for the content on their sites.

Throughout the internet’s growth, government adjustments to the regulation of online content started and ended with Section 230 — until 2018. That year, two acts, the Stop Enabling Sex Traffickers Act (SESTA) and the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), stripped the protections of Section 230 from websites hosting sex trafficking.

Currently, social media platforms face a growing amount of child abuse content on their sites each year — a testament to how the lack of regulation fosters a dangerous digital environment. While social media companies may claim that much of this content is against their terms of service, the only definite way to ensure no website houses similar content is through a revision of Section 230.

Modifying a law that essentially created the modern internet may appear to be a great undertaking, but as the passage of SESTA and FOSTA in 2018 showed, it’s possible. While debates over exactly what content endangers children — the specifics of any new act — will presumably continue, such an act must, in any event, amend Section 230. This would enable the government to enforce the act and to repeat the process with other issues raised in internet-focused hearings, putting an end to the endless hearings we see today. Amending Section 230 would protect child safety online by holding the owners of these websites accountable.

This change would have a large impact on how we regulate the internet: it adds another layer of government oversight while also proving that internet regulation is a dynamic process. On top of paving the way for more regulation, social media companies in particular would be required to enforce rules more uniformly, rather than each company deciding for itself what to regulate and what to ignore. The current system leaves large gaps in user safety because the regulations on one site are less restrictive than those on another.

But change on the internet — as common as it may be — is not always appreciated, especially by owners such as Mark Zuckerberg, who is open to revisions of Section 230 only if they benefit him. Zuckerberg’s vision of a revised Section 230 is one where companies with regulation systems are immune from liability. These regulation systems are algorithms built to decide what content is allowed on a given site, and, in Zuckerberg’s proposal, these algorithms would need to filter out whatever the government deems inappropriate. Under this proposal, however, if Facebook’s regulation system fails and lets through a piece of content deemed inappropriate, Facebook is still not held liable for that content.

Although this proposed revision may seem reasonable at first glance, it allows companies with the means to build superior regulation systems to continue operating exactly as they are, free of consequences. This proposal would also make it more difficult for smaller companies to gain a foothold in the sector: the established giants of the industry could easily adapt their algorithms to new regulations, while smaller companies may need to build these systems from scratch.

A change in Section 230 will inevitably change the internet — but this isn’t necessarily a bad thing. The government will require companies to monitor how they regulate their content, which will, in turn, pave the way for an internet that doesn’t harm as many users as it hosts. While amending Section 230 piece-by-piece might be a tedious endeavor, it may be the only way we can secure internet safety.

Thomas Muha is an Opinion Columnist who writes about the legal and economic issues facing technology and the internet. He can be reached at tmuha@umich.edu or on X at @TJMooUM.