Design by Allison Payne.

On Oct. 3, the U.S. Supreme Court agreed to hear Gonzalez v. Google LLC, a case concerning Section 230 of the Communications Decency Act that stems from the 2015 ISIS terrorist attacks in Paris. The plaintiff argues that YouTube’s recommendation algorithms surfaced ISIS videos that helped radicalize the attackers, resulting in deaths, and that YouTube should be held liable for its role in these acts. Central to the case is whether YouTube and similar social media companies are legally responsible for the actions of those who post on their sites, a question that runs directly against the protections of Section 230.

As the case awaits a hearing, speculation has already begun over the future of Section 230 and whether it will survive. A ruling that strikes down Section 230 entirely could prove detrimental to how college students use the internet and social media.

To understand the gravity of this Supreme Court case, it’s important to first know what exactly Section 230 is and what it applies to. Appearing in the Communications Decency Act (CDA) of 1996, Section 230 states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” For most posts made on a social media site, Section 230 shields the company from liability for anything said by its users. The only exception came in 2018, when Congress made social media companies liable for sex trafficking that occurred through their sites.

Section 230 has greatly shaped the modern internet. Its protection has allowed social media to exist as it currently does, built on user-generated content and algorithms. Social media giants like Facebook, Twitter, Instagram and now TikTok all benefit from this section, which allows users and their content to be the centerpiece of each platform. But the user isn’t the only part of social media. For these sites to function as they do, algorithms work hand in hand with user-created content to build an individual feed for each person. While these algorithms show users content produced by others, they are designed by the social media companies themselves, and that design is a major focus of the Gonzalez case.

As Gonzalez v. Google LLC heads to the Supreme Court, the algorithms used by social media companies are at stake. The plaintiff focuses on YouTube’s algorithms in seeking to hold the company liable for the 2015 terrorist attacks. If the plaintiff prevails, one possible result is the removal of recommendation algorithms from social media in the U.S. That might seem manageable for sites like Snapchat, which focus more on direct contact between users, but other sites are not as fortunate. Sites such as Instagram, Twitter and TikTok — whose main feature, the For You page, shows users videos in an order determined by an algorithm — are at high risk of feeling the effects of this case.

To understand how the removal of algorithms would affect students, I talked to LSA junior Makayla Gillette. Gillette uses a variety of social media sites, most regularly TikTok, where she says she spends over two hours a day. She attributes her regular use to the For You page, adding that “it’s scary how accurate the algorithm gets, it’s like they’ll send me content that’s so accurate to my life.”

For Gillette, using TikTok is so reliant on the For You page that she said she wouldn’t use the app without it. She explained her decision, saying, “I don’t really care about people I follow (on TikTok).” Gillette voiced a view she believes many of her peers share: using TikTok, or other social media sites, without an algorithm-based feed tends to be boring.

Gillette’s answers offer insight into just how central algorithms are to social media. On TikTok especially, the For You page is why students like Gillette use the app in the first place; remove it, and the site loses most of its appeal. While this might not be true of every social media site, it demonstrates that the algorithm can, at times, be even more impactful than the content users produce.

Removing algorithms from social media would likely create a shift in the structure of the internet unlike anything seen in years. Sites that rely on an algorithm in any way would feel the greatest effects, no longer able to sustain the model that made them popular. Although this case and its final ruling are still far off, it is worth asking whether social media can exist at all afterward.

The removal of Section 230’s protections could mark the beginning of the end for social media. Without publisher protection, websites would likely have to heavily regulate their content and take down any post that is contested. That overcaution, combined with the absence of an algorithm to keep up user retention, would likely lead to social media’s eventual downfall.

Section 230 in its current form creates problems of its own as well. According to a 2017 research paper by the National Consortium for the Study of Terrorism, social media played some role in the radicalization of a claimed 90% of extremists. Removing Section 230 would likely make social media companies liable for this content, as well as for the algorithms that promote it, but the overall effect on the internet would be a harsh overreaction.

While no single solution would appease both main sides of the Section 230 debate, there is a case for minor adjustments. Congress could take action similar to its 2018 measure, which held social media companies liable for sex trafficking content; a comparable law could apply the Anti-Terrorism Act, which prohibits the promotion of terrorist activities online.

Applying that act, however, would likely not keep all terrorist recruitment off the internet; neither would a full repeal of Section 230, given the sheer size of the internet and the loopholes terrorist organizations would likely exploit. It would, however, likely greatly reduce the radicalization process, decreasing the likelihood of recruitment via the internet and social media. Even if the court’s decision keeps Section 230 alive, that doesn’t mean it shouldn’t change.

For college students, the use of social media may change completely depending on the ruling in this case. Sites like TikTok and Instagram could even cease to exist if the decision goes far enough. The ease of communication that social media affords, and that students rely on daily, is now at risk.

Tom Muha is an Opinion Columnist & can be reached at