In Gonzalez v. Google LLC, currently before the US Supreme Court, the question is whether Section 230 of the Communications Decency Act, which shields internet service providers from liability for user-posted material, also covers the use of recommender systems, in this case in the context of terrorist-related content. The Court will decide whether websites lose that immunity when they "recommend" content to their users, something the majority of social media sites and ecommerce platforms do. The suit contends that tech firms should be held accountable for harmful material their algorithms promote. In this blog, we’ll explore the case and its potential implications for platforms should the Supreme Court rule in favor of the Gonzalez family.
The Supreme Court Considers Whether Google Can Be Held Accountable for Recommending ISIS Videos
In November 2015, Paris was targeted by a series of terrorist attacks later claimed by the Islamic State. The tragedy resulted in the deaths of 130 people, among them 23-year-old student Nohemi Gonzalez, who was participating in an exchange program with California State University, Long Beach. Her family filed suit, contending that by recommending Islamic State-related content, Google-owned YouTube served as a recruitment tool for the group in violation of American anti-terrorism laws.
During the lengthy hearing, Google's attorney Lisa Blatt argued that Section 230 protects the firm from liability for the third-party videos its recommendation algorithms surface, and that this immunity is necessary for such companies to offer safe and useful services to their users. Eric Schnapper, counsel for the Gonzalez family, countered that applying Section 230 to algorithmic recommendations incentivizes the promotion of dangerous material, and he asked the court to narrow those protections.
What is Section 230 of the Communications Decency Act?
Section 230 of the Communications Decency Act is a law that provides immunity from certain types of claims to internet service providers and interactive computer services. The law states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This means that an interactive computer service, like Google, cannot be held liable for the content posted by its users.
The purpose of this law is to protect platforms from liability for the actions of their users. Without it, platforms could be held liable for any content their users post. This would create an overwhelming moderation burden and would likely lead platforms to adopt much stricter limits on what users could post.
How Do Google's Algorithms Work?
Google's search engine uses a complex algorithm to determine which webpages to include in its search results. The algorithm is designed to prioritize pages that are most relevant to the search query, so those pages are more likely to appear at the top of the search results.
Aside from relevance, Google's algorithm takes into account other factors, such as how often a page is viewed and the number of links pointing to it, so pages with more links and views are also more likely to rank highly. The age of a webpage is considered as well: for time-sensitive queries, fresher pages may be favored in the results.
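To make the idea concrete, here is a toy sketch of how a ranking score might combine the factors described above: relevance, inbound links, and page age. Everything here, the function name, the weights, and the formula, is purely illustrative; Google's actual ranking algorithm is proprietary and far more complex.

```python
import math

def toy_ranking_score(relevance, inbound_links, page_age_days):
    """Hypothetical ranking score combining relevance, links, and freshness.
    Weights are illustrative, not Google's actual algorithm."""
    # Relevance to the query dominates the score.
    score = 3.0 * relevance
    # Inbound links count, but with diminishing returns.
    score += math.log1p(inbound_links)
    # A mild freshness boost that decays as the page ages.
    score += 1.0 / (1.0 + page_age_days / 365.0)
    return score

# Rank a few hypothetical pages, highest score first,
# the way a results page orders its entries.
pages = [
    ("fresh, relevant page", toy_ranking_score(0.9, 120, 30)),
    ("old, heavily linked page", toy_ranking_score(0.7, 5000, 2000)),
    ("irrelevant page", toy_ranking_score(0.1, 10, 10)),
]
pages.sort(key=lambda p: p[1], reverse=True)
```

Even in this simplified form, the key property holds: no human reviews each page; the ordering falls out of a formula applied uniformly to every candidate result.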
The algorithms are constantly being refined to ensure that the content being promoted is relevant and useful to the user. Google has stated that it doesn’t manually review the content that its algorithms promote, and therefore, can’t control what content is shown. This has been a point of contention for the Gonzalez family, who argue that Google should have more control over what content is promoted on its platforms.
What's the Status of the Case?
In October 2022, the Supreme Court agreed to hear the case alongside Twitter, Inc. v. Taamneh, a related case involving Section 230 and terrorism-related material. This will be the first time the Supreme Court rules on Section 230, which has drawn partisan criticism aimed at Big Tech since at least 2015. Justice Clarence Thomas had expressed his desire to examine Section 230 in earlier dissenting opinions, stating that social media companies should be regulated similarly to "common carriers" and that content-based discrimination should be prohibited.
Numerous Big Tech organizations, along with smaller sites, submitted briefs backing Google in the case. Although most agree that Section 230 should be updated to address current issues, the briefs largely argued that any changes should come from Congress rather than from the Supreme Court. This view was also advocated by the law's authors, Ron Wyden and Christopher Cox, and by the ACLU. Several briefs were filed in support of Gonzalez as well, from conservatives such as Josh Hawley and Ted Cruz and from the Anti-Defamation League, arguing that Google and other Big Tech companies have used Section 230 to remain immune from accountability for harmful content posted on their sites. Certain child protection organizations also filed briefs in support of Gonzalez.
On February 21, 2023, the Supreme Court heard arguments in the Gonzalez case. Observers noted that justices on both the liberal and conservative wings questioned the role of algorithms in delivering internet services. There is uncertainty about whether recommended content can be meaningfully distinguished from merely hosted content, as well as about the possible repercussions of altering Section 230, such as a surge of lawsuits and significant economic impact.
The Bottom Line
Even if Gonzalez v. Google is decided in favor of the plaintiffs, the statute itself is unlikely to change; the Gonzalez family would likely turn to Congress to pass new legislation addressing the gaps in the law. The alternative outcome is that the law is left as it stands. If the Court decides that Section 230 does not protect tech companies that use algorithms to promote certain types of content, the immunity will be narrowed, and it will become possible for states and other bodies to enact laws holding these companies accountable for the content they promote.
Even if the decision doesn’t result in significant changes to the law, it will still affect how Big Tech companies operate. After all, these platforms are driven by algorithms: the more those algorithms learn about their users, the more useful and relevant their recommendations become. The outcome could shape how users navigate the web, as well as how business pages are promoted on these sites. To stay on top of the latest news for platforms and marketplaces, make sure to subscribe to the Marketplace Risk Newsletter today.